
OmniComm and eClinical Forum Release Survey Results on ICH-E6(R2) and RBM Implementation

Posted by Abby Abraham on May 23, 2019 12:15:00 PM

Changes to ICH-GCP requirements, in the form of ICH-E6(R2), and adoption by regulatory agencies are expected to impact processes and operations in the clinical research industry. A new survey from eClinical Forum, in partnership with OmniComm Systems Inc., highlights the current state of implementation of ICH-E6(R2) and RBM in various segments of the clinical research industry. The survey results point to a sharp divergence between large and non-large organizations in implementing ICH-E6(R2) and adopting risk-based approaches to monitoring.

The author of this article presented interim study findings at the annual Society of Clinical Research Associates (SOCRA) meeting on September 29, 2018, in New Orleans, USA, and at an eClinical Forum webinar in November 2018. This article presents the final survey conclusions.

Survey Objectives

The following were the key insights expected from the survey:

  • Current outlook of ICH-E6(R2) implementation in the industry.
  • Approach and implementation methodologies.
  • Major challenges in implementing ICH-E6(R2).
  • Role of technology in enabling ICH-E6(R2).

Survey Methodology

The survey was launched in June 2018 through eClinical Forum member channels and through an email survey campaign to professional contacts in the industry. The survey was conducted in a neutral manner, irrespective of organization size, type or geography, and respondents were anonymous. Prior to the campaign, duplicate responses were reduced by filtering the contact list so that no more than one person per organization was invited. More than 3,000 emails were sent across 3,397 organizations, and 211 respondents participated in the survey. Since the understanding of RBM is still evolving and there could be multiple approaches to implementing RBM, the survey was flexible and allowed respondents to skip questions they were not knowledgeable about or felt uncomfortable answering. Hence, a 100% response rate to every question was not expected.

Survey Analysis Framework

The survey results were analyzed from different perspectives:

  1. Overall response.
  2. Size of the organization – Grouped as large organizations (>2000 staff) and non-large organizations (<2000 staff).
  3. Type of organization – Sponsor (pharma, biotech and medical devices), CROs, Academic research organizations (AROs), consultants, site management organizations (SMOs), eClinical vendors and others.
  4. Geographical location of the organization’s headquarters: Americas, Europe, Middle East-North Africa and Asia-Pacific region.

Only significant differences (a change of 25% or higher relative to the overall response in different categories or type of respondents) have been reported in this article.
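For illustration only (this is not the survey's actual analysis code), the following sketch shows how such a relative-change threshold can be applied; the percentages in the example are hypothetical.

```python
# Hypothetical sketch of the 25% relative-change reporting threshold.
def is_reported(subgroup_pct: float, overall_pct: float, threshold: float = 0.25) -> bool:
    """Flag a subgroup response whose change relative to the overall response is >= threshold."""
    if overall_pct == 0:
        return subgroup_pct > 0
    return abs(subgroup_pct - overall_pct) / overall_pct >= threshold

# With an overall response of 40%, a subgroup response of 52% is a 30%
# relative change (reported), while 46% is only a 15% change (not reported).
print(is_reported(52, 40))  # True
print(is_reported(46, 40))  # False
```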

 

Focus of survey questions:

  • The number of survey questions had to be balanced in order to engage respondents while still ensuring that they completed the survey.
  • We sought to collect information about the respondent’s organizational approach to implementing ICH-E6(R2)/RBM even if the organization had not formally implemented an RBM process. As a result, some of the survey questions imply the desired or future state of RBM adoption and implementation.
  • Respondent profile:
    • Type of organization.
    • Size of organization.
    • Department or function.
    • Experience in RBM.
  • Key areas of the survey:
    • Function/department driving ICH-E6(R2)/ RBM implementation.
    • Implementation time frame.
    • Execution of risk planning.
    • Utilization of tools and technology in risk planning.
    • Risk-monitoring implementation.
    • Central-monitoring implementation.
    • Central-monitoring tool utilization.
    • The maturity of site/risk assessment and risk control.
    • Risk control and communications implementation.
    • Implementation barriers and support.

 

Survey Respondent Profile (211 Respondents)

Type of organization:

[Chart 1: Type of organization]

Function or department:

[Chart 2: Function or department]

Organization’s primary location:

[Chart 3: Organization’s primary location]

Organization size by headcount:

[Chart 4: Organization size by headcount]

The following is an analysis of the profile of the respondents who participated in the survey.

Organizational Representation of Respondents

Representatives from sponsors (pharma, biotech and medical device organizations) and CROs accounted for the highest share of respondents (about 75%). The remaining respondents represented academic research (11%) and other stakeholders (13%). The “Other” category included consultants, SMOs and eClinical vendors.

Functional Representation of Respondents

A significant number of respondents (48% of the survey pool) represented clinical operations. Clinical data management participants (24%) and quality management professionals (12%) also took part in this survey. A small proportion of respondents represented medical and scientific affairs (2%), information technology (6%), corporate (1%) and other functions (6%). The intent of capturing this information was to ensure that the right stakeholders, those involved in the execution of RBM/ICH-E6(R2), provided relevant input.

Primary Geographical Headquarters of the Respondent’s Organization

This was a global survey, with a healthy geographical distribution. Companies with headquarters in the Americas (North and South America) represented about 42% of the survey pool, followed by 31% from Europe and the Middle East, and 27% from organizations with headquarters in the Asia-Pacific region. Survey participants may have operated from offices in other geographical regions, but the survey sought to identify each organization’s primary location (headquarters). It was important to establish the location of corporate headquarters because decisions about RBM implementation are generally driven from the central corporate office.

Size of Respondent’s Organization

A majority of respondents (about 58%) represented non-large organizations (small and midsize organizations), while about 42% represented large organizations. The distribution of respondents across the different sizes of small and midsize organizations was equitable.

The following are the responses to the survey questions. The outcome of the sub-analysis of the results is provided where significant patterns or outliers are observed. The survey questionnaire allowed respondents to skip questions that they were not conversant with or not comfortable answering. Certain questions allowed respondents to choose multiple options.

Survey Questions:

The following survey questions were finalized based on discussions and agreement with members of the eClinical Forum. 

Q1. Is your company or organization currently using Risk-Based Monitoring (RBM)?


A majority of the respondents reported that their organization is currently using RBM (64%) and nearly one-third (29%) noted that their organization is not currently using RBM. A small percentage (6%) of respondents were not sure if their organization was using RBM.

From a geographical perspective, 76% of respondents from Europe and the Middle-East region reported that their organizations were implementing RBM, and that response was uniform across large and non-large organizations.

About 64% of organizations in the Americas and about 52% of organizations headquartered in the Asia-Pacific region reported applying RBM. The common pattern of higher adoption of RBM in large organizations suggests general acceptance of RBM as a workable model.

 

Q2. What do you perceive to be the primary reason your company or organization has not yet implemented RBM?


 

For the 53 survey participants who responded that their organization has not implemented RBM, a follow-up question was provided to understand the reason for not implementing risk-based approaches to monitoring. About 40% of the 53 respondents felt that it was due to a lack of organizational readiness, such as SOPs and organizational structure. Respondents from non-large organizations selected this as the primary reason. About 23% of respondents believed that challenges such as the non-utilization of RBM tools and the unavailability of IT infrastructure were barriers to RBM implementation. A small percentage of respondents (about 6%) felt that a lack of understanding of the regulatory guidelines and insufficient resources posed challenges to implementing RBM.

A significant number of respondents (about 25%) were grouped under the “Other” category. Analysis of the comments revealed different reasons, with some believing that the “gold standard within their organization was 100% SDV [source data verification]” and that “their customers are small and they are quite focused on meeting their study objectives.” A significant proportion of those who selected “Other” believed the question was not applicable to them, e.g., eClinical vendors, SMOs, etc.

 

Q3. Which function within your organization is/will be primarily responsible for implementing ICH E6(R2) compliance?


A majority (about 51%) indicated that Clinical Operations is driving implementation of ICH-E6(R2) in their organization. Interestingly, Quality Management or the compliance function is also taking the lead in some organizations (24%). Ten percent of the survey pool named Data Management Operations as the principal force behind ICH-E6(R2) implementation. A slender minority indicated that their organization is still determining which team will handle ICH-E6(R2) implementation, and a few are looking at a new function to handle the implementation. Some of the respondents (6%) selected the “Other” option. Those results were quite consistent across different types of organizations and across locations.

 

Q4.  How far has your organization commenced with the implementation of the ICH E6(R2)?


Given the favored status of ICH-E6(R2) with regulatory agencies, this question was intended to understand the pace at which organizations are adopting ICH-E6(R2). About 37% of the respondents conveyed that they have already implemented RBM and are planning to add new or more studies. Another 8% of the respondents said that they are piloting studies using RBM with new processes and SOPs. This is consistent with the response to the earlier question about whether they are applying risk-based approaches to monitoring. Additionally, the findings indicate that large organizations were already implementing or piloting studies using RBM, while responses indicating early stages of implementation came largely from non-large organizations, especially small and midsized organizations from the Americas, Europe and the Middle East regions.

A significant group of respondents (22.73%) indicated that they have created or are in the process of creating new processes and SOPs. About 14% of the respondents conveyed that they are in the very early stages of adoption, as they were performing a gap analysis of current processes. About 18% had not commenced or were not sure how to go about using ICH-E6(R2). In total, about 55% of the respondents indicated that they have yet to implement ICH-E6(R2) in their organizations.

 

Q5. In what time frame is your organization expecting to become totally compliant with the ICH E6(R2)?

 


In line with the earlier questions, a significant proportion of respondents (31%) mentioned that they are already compliant. Responses from non-large organizations represented more than half of the 31% who reported compliance. There was also a significant proportion of respondents (27%) who were not sure when they expected ICH-E6(R2) to be implemented in their organizations. Further analysis revealed that this uncertainty came primarily from non-large sponsors, specifically respondents from biotechnology and medical device manufacturers. Cumulatively, about 36% of the respondents expect their organizations to be ready to implement ICH-E6(R2) within 12 months of the date of this survey.

Risk Planning

In this section of the survey, we wanted to understand different aspects pertaining to the execution of risk planning and assessment as stipulated in Sections 5.0.1, 5.0.2 and 5.0.3 of ICH-E6(R2).

 

Q6. Who is/will be primarily responsible for performing risk planning on a study?


From this question, we wanted to know who is primarily responsible for performing risk planning in the organization. The largest response (64%) indicated that the clinical study manager/lead or equivalent in clinical operations was responsible for performing risk planning. Significantly, quality or compliance managers (about 10%) are taking responsibility for risk-planning activities in organizations. To a lesser extent, data managers and lead CRAs were also taking the lead in risk planning, according to the survey results. A small proportion of the respondents have yet to determine who will take the lead in the organization; most respondents who selected this response were from small and midsized organizations. Among the small proportion of respondents who selected “Other,” a pattern that emerged was the appointment of a Risk Manager or Lead who specifically performs the risk-planning function. No significant patterns were detected based on size or geography.

 

Q7. How frequently do/will different functional teams meet together to conduct risk planning and review on a study?


The rationale behind this question was to understand how organizations are approaching the frequency of reviewing risks across functions. The risk-planning process can be complex when more stakeholders are involved, and it requires continuous discussion. Therefore, understanding the frequency of such meetings can provide insights into the intensity and level of commitment from different functional teams.

A significant proportion of respondents (31%) reported that the meeting frequency has yet to be determined. About 60% of that response came from respondents representing non-large organizations, and a higher proportion of this response came from Europe and the Middle East. For those respondents who are conducting cross-functional risk-planning meetings, it is safe to infer that about 68% of the organizations were conducting risk-planning and review meetings every 12 weeks or less (2 weeks to 12 weeks). A small proportion of respondents (7%) indicated that their organizations have not applied a cross-functional risk-planning process.

 

Q8. What type of tool does/will your organization use for risk planning?


The risk-planning process entails formal documentation of risks and assessing the identified risks by deriving a risk score based on certain organizational criteria. TransCelerate pioneered the introduction of a tool called Risk Assessment and Categorization Tool (RACT) in 2012. Since then, a variety of organizations have used and adapted this publicly available resource. It was pertinent to understand the extent to which the industry has adopted RACT.

A significant proportion of organizations utilize RACT – either in the original form that is currently available (12% of survey respondents) or a version modified to suit organizational needs (33%). Another significant section of the respondents (28%) were still determining the right tool to be used for risk planning. Some organizations (21%) have created internal tools or utilized a risk-planning tool from a vendor (6%).
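For readers unfamiliar with risk-scoring tools, the sketch below shows one way a RACT-style risk score could be derived in code: each identified risk is rated for likelihood, impact and detectability, and the product is bucketed into a category. The rating scales, example risks and cut-offs are illustrative assumptions, not the actual RACT definitions.

```python
# Hypothetical, simplified RACT-style scoring; organizations adapting RACT
# define their own rating scales, criteria and category boundaries.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int     # 1 (low) to 3 (high)
    impact: int         # 1 (low) to 3 (high)
    detectability: int  # 1 (easy to detect) to 3 (hard to detect)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact * self.detectability

    @property
    def category(self) -> str:
        if self.score >= 18:
            return "High"
        if self.score >= 8:
            return "Medium"
        return "Low"

risks = [
    Risk("Complex eligibility criteria", likelihood=3, impact=3, detectability=2),
    Risk("Unfamiliar eCOA technology at sites", likelihood=2, impact=3, detectability=2),
]
for r in risks:
    print(f"{r.name}: score={r.score}, category={r.category}")
```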

 

Q9. Please select the features and functionalities your risk-planning tool achieves (or should achieve).  Select all that apply.


We posed this question to understand the functionalities provided or desired in a risk-planning tool. This is pertinent from a scalability perspective. The question allowed the respondents to choose multiple options or features that they use or desire.

The two features respondents most desired in a risk-planning tool were:

  • A continual and integrated approach to creating key risk indicators (KRIs) and linking them to a dashboard and/or alert (85 selections).
  • The ability to update risk plans and maintain different auditable versions of risk plans (81 selections).

The other features that found favor with the respondents included a system-validation requirement (72 selections), the ability to record an audit trail (71 selections) and a reusable risk library (70 selections).

A larger proportion of respondents representing CROs headquartered in the EU, Middle East and Asia-Pacific preferred the features that enabled the creation of KRIs and dashboards as well as a reusable organizational and program-level risk library.

 

Risk-Monitoring Control, Communications, Review and Reporting

In this section, we endeavored to understand different aspects pertaining to the execution of risk monitoring as stipulated in Sections 5.0.4, 5.0.5, 5.0.6 and 5.0.7 of ICH-E6(R2).

 

Q10: How does/will your organization measure and analyze operational/study performance and risks (i.e. key performance indicators (KPIs) and key risk indicators (KRIs))?


We posed this question to get a sense of how KRIs or KPIs are measured. Given the recent formal introduction of ICH-E6(R2), a wide spectrum of maturity in how organizations utilize KRIs or KPIs could be expected. At one end of the spectrum, there could be organizations that are not familiar with defining and utilizing KRIs and KPIs; the other end includes progressive organizations that continually measure, analyze and adjust KRIs. Cumulatively, about 38% of the respondents indicated that they were in the early phase of understanding this methodology. This view is reflected in the 6.5% of respondents who selected the option of not being familiar with KRIs/KPIs and another 16% who did not have standardized KRIs/KPIs.

Also, 16% of the respondents wanted to first standardize a KPI/KRI library and then systematically utilize the KPIs/KRIs for further evaluations. A significant proportion of the respondents (31%) seemed to have a mature approach of analyzing and adjusting KRIs on an ongoing basis; respondents from large sponsors that have achieved process maturation contributed significantly to this response. A significant proportion of respondents (28%) indicated that KRIs/KPIs exist for some operational needs and are sufficient. Non-large CROs contributed heavily to this response.
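To make the idea of measuring a KRI concrete, here is a minimal, hypothetical sketch that computes one KRI (open queries per 100 eCRF pages) per site and flags it against assumed amber/red thresholds. The metric definition and thresholds are illustrative and not drawn from the survey.

```python
# Hypothetical KRI: open queries per 100 eCRF pages, flagged per site.
sites = {
    "Site 101": {"open_queries": 14, "crf_pages": 250},
    "Site 102": {"open_queries": 42, "crf_pages": 300},
    "Site 103": {"open_queries": 18, "crf_pages": 200},
}

AMBER, RED = 8.0, 12.0  # illustrative thresholds per 100 pages

for site, d in sites.items():
    kri = 100 * d["open_queries"] / d["crf_pages"]
    status = "red" if kri >= RED else "amber" if kri >= AMBER else "green"
    print(f"{site}: {kri:.1f} open queries per 100 pages -> {status}")
```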

 

Q11. What is/will be your on-site monitoring strategy on studies?


 

Risk-based approaches to monitoring encompass on-site monitoring and central/remote monitoring. In the context of on-site monitoring, organizations could implement varying degrees of on-site monitoring strategies, primarily source data verification (SDV) and source data review. This survey question intended to understand the preferred strategy used by organizations. About 19% of the respondents indicated that they utilize 100% on-site monitoring. Another strategy followed by the industry is fixed monitoring with reduced levels of SDV; about 12% of the respondents indicated the application of this strategy. Targeted SDV (TSDV) encompasses verification of a specific percentage of data. About 39% of the respondents indicated that they implement TSDV, with respondents from large organizations contributing significantly to that response. In all three strategies, the percentage of SDV is typically adjusted (titrated) based on the outcomes of SDV. About 11% of the respondents indicated that they use other strategies; a review of the comments revealed more customized, adaptive approaches to source data verification and review based on the risk propensity of sites and subjects.
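As a purely illustrative sketch of what titrating SDV can look like, the function below adjusts a targeted-SDV rate up or down for the next monitoring cycle based on the error rate found in data already verified. The baseline rate, error-rate bands, step size and bounds are assumptions, not recommendations from the survey.

```python
# Hypothetical titration of a targeted-SDV rate based on observed error rates.
def next_sdv_rate(current_rate: float, error_rate: float,
                  low: float = 0.02, high: float = 0.10, step: float = 0.10) -> float:
    """Return the SDV rate for the next cycle, bounded between 20% and 100%."""
    if error_rate > high:    # many discrepancies found -> verify more data
        current_rate += step
    elif error_rate < low:   # clean data -> verify less
        current_rate -= step
    return min(1.0, max(0.2, current_rate))

print(next_sdv_rate(0.5, error_rate=0.12))  # 0.6 - error-prone site, more SDV
print(next_sdv_rate(0.5, error_rate=0.01))  # 0.4 - clean site, less SDV
```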

 

Q12. What is/will be your organization's operational approach to performing Central Monitoring?


The second part of understanding the overall monitoring strategies entails the execution of central monitoring. This survey question sought to understand the approaches used in central monitoring. A significant proportion of respondents (28%) revealed that they used a combination of clinical data monitoring and statistical review of data; large organizations contributed more to this approach. The next largest response (17%) was the strategy of using in-house monitors to monitor data and follow up with sites remotely. The CROs (large and non-large in equal measure) contributed significantly to this response. About 16% of respondents indicated that they utilize a separate team of central monitors to review study data. Interestingly, a significant proportion of respondents (19%) revealed that they have yet to determine a central monitoring strategy; respondents from non-large organizations, specifically sponsors, contributed primarily to this response. Other responses included utilizing pure data management (9%) and statistical teams (1%).

 

Q13. What are/will be the different data source systems that your organization considers essential to monitor data for implementing ICH E6(R2)? Select all that apply.


 

This survey question sought to understand the different data sources utilized to monitor data. The electronic data capture (EDC) system was the most commonly selected system for monitoring data (113 selections). This was followed by the clinical trial management system (CTMS) (85 selections) and a cluster of the interactive web/phone randomization system (IXRS) (65 selections), the electronic trial master file (eTMF) (63 selections) and electronic patient-reported outcome (ePRO) systems (58 selections). A small section of respondents selected laboratory and wearable-device data source systems under the “Other” section. EDC continues to be seen as the primary data source for data monitoring, with increasing inclusion of other systems.

 

Q14. What is/will be your organization's data-driven approach to performing Central Monitoring? Select all that apply.


Risk-based approaches to monitoring could utilize different types of central data monitoring or data-driven methods. It could be a single approach or a combination of multiple approaches. This question sought insight about common methodologies for data monitoring.

Site-level trending and patterns of data across subjects and operational metrics seemed to be the approach most favored by the respondents (85 selections). This was followed by an aggregate review of KRIs at the site and study level (77 selections). Review of automated alerts or edit checks programmed by data management or statistical teams and issued by the system when a condition is met was also seen in good measure (71 selections). Respondents from large sponsors and non-large CROs contributed more selections to the aforementioned options. A more granular view of subject visits across all subjects was also of interest to respondents (52 selections). A small section of responses indicated that central monitoring methods have yet to be determined. Interestingly, predictive analytics is also one of the currently used or to-be-used approaches specified under “Other.”

 

Q15. What format of data does/will your organization use for central/remote monitoring? Select all that apply.


 

This question attempted to understand the means by which organizations monitor data. The spectrum of data-review methodology ranges from a conventional, basic level of CRF page review at one end, progressing toward the use of EDC data exports or line listings for review; more advanced approaches include the use of data visualizations and statistical tools. The respondents indicated that the most commonly used or desired approaches to monitoring data were EDC exports (79 selections) and validated data visualizations (78 selections). A majority of respondents representing non-large organizations selected these options, while respondents from large organizations slightly favored data visualizations over the CRF exports option. Review of eCRF forms (62 selections) was also utilized, and a review using statistical tools (66 selections) was also reported. A small proportion (16 selections) indicated that methodologies have yet to be determined, with non-large organizations representing the majority of the respondents in this segment.

 

Q16. How do/will central/remote monitors document data review activity?


Central or remote data monitoring requires data to be in a place that is readily accessible, and different approaches are utilized to document the review of data. This survey question generated an equitable distribution of responses. Answers included the use of eCRFs as a primary means to record data review (19%), customized spreadsheets to document outcomes and observations (22%), digital tools to record review observations and outcomes (10%) and an eClinical system that captures observations and outcomes at the study, site, subject and visit level on different KRIs (22%). A significant proportion (23%) of respondents have yet to determine the approach to documenting the review. The “Other” section had responses that mentioned using contact reports or repurposing site-monitoring visit reports. Respondents from CROs selected using CRF pages as a tool and utilizing an eClinical vendor’s system to capture the review outcomes, while respondents from sponsors utilized customized spreadsheets or an eClinical vendor’s system. A high proportion of sponsors were in the yet-to-be-determined category.

 

Q17: How do/will you primarily identify sites that have higher risk propensity? Select all that apply.


In risk-based approaches to monitoring, understanding relative risks at sites is important. This helps in prioritizing sites and optimizing resources and monitoring efforts for downstream actions that are handled remotely or on-site. This survey question attempted to understand the approaches used by different clinical teams to make this assessment. The responses varied from basic approaches, such as relying on monitoring visit reports, to more advanced statistical and risk-scoring frameworks. The survey also recognized the availability of multiple approaches, and therefore respondents had the ability to choose more than one option. The most commonly selected option was to use the outputs from site-monitoring visits and central data monitoring (73 selections). A significant proportion of respondents utilized central data monitoring outcomes for sites and then used those outcomes to grade the risk propensity at sites (61 selections); a significant majority of respondents for this selection were from large organizations. Interestingly, an almost equivalent number of respondents (57 selections) adopted the approach of utilizing a site risk-scoring framework and performing aggregate analysis of KRIs to grade sites based on perceived risks. More CROs, especially in the Americas, tend to use a site risk-scoring framework and apply the aggregate analysis of KRIs. A smaller section of respondents (36 selections) used statistical models, and a few were still undecided (14 selections).
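The sketch below illustrates one possible form of such a site risk-scoring framework: several normalized KRIs (0 = best, 1 = worst) are combined with weights into a single score that grades a site's risk propensity. The KRIs, weights and grade boundaries are hypothetical.

```python
# Hypothetical weighted aggregation of normalized KRIs into a site risk grade.
WEIGHTS = {"query_rate": 0.4, "overdue_visits": 0.3, "protocol_deviations": 0.3}

def site_grade(kris: dict) -> str:
    score = sum(WEIGHTS[k] * kris[k] for k in WEIGHTS)
    if score >= 0.6:
        return "High risk"
    if score >= 0.3:
        return "Medium risk"
    return "Low risk"

print(site_grade({"query_rate": 0.8, "overdue_visits": 0.7, "protocol_deviations": 0.5}))  # High risk
print(site_grade({"query_rate": 0.2, "overdue_visits": 0.1, "protocol_deviations": 0.0}))  # Low risk
```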

 

Q18. Does/will your organization set Quality Tolerance Limits (QTLs)?


Section 5.0.4 of ICH-E6(R2) states: “Predefined quality tolerance limits (QTLs) should be established, taking into consideration the medical and statistical characteristics of the variables as well as the statistical design of the trial, to identify systematic issues that can impact subject safety or reliability of trial results. Detection of deviations from the predefined quality tolerance limits should trigger an evaluation to determine if action is needed.” We designed this question to understand how organizations are approaching QTLs and the extent of implementation. In total, about 44% of respondents are applying or are in the process of implementing QTLs. Respondents from large organizations dominated that response. The majority of the respondents (56%) do not seem to be currently implementing or intending to apply QTLs. Primarily, respondents from non-large organizations indicated this option.
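To illustrate the concept in the quoted guidance, the hypothetical sketch below compares an observed study-level parameter against a predefined QTL and signals that an evaluation is needed when the limit is breached. The parameter and limit are illustrative; ICH-E6(R2) does not prescribe specific values.

```python
# Hypothetical QTL check: a breach of the predefined limit triggers an evaluation.
def check_qtl(observed_rate: float, qtl: float) -> str:
    if observed_rate > qtl:
        return "QTL exceeded - trigger evaluation and document important deviations"
    return "Within tolerance - continue routine monitoring"

# e.g. proportion of randomized subjects with a missed primary-endpoint assessment
print(check_qtl(observed_rate=0.07, qtl=0.05))
print(check_qtl(observed_rate=0.03, qtl=0.05))
```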

 

Q19. When issues are raised upon review of a KRI or critical data, do/will you have a tool that (select all that apply):


 

This survey question was designed to understand functionalities in a tool that could document issues and actions taken during risk or data monitoring. In general, the respondents demonstrated roughly equal preferences for the possible functionalities of such a tool. The most popular functionality (56 selections) was the option to record issues and actions in one place (a single source of truth) instead of having documentation in multiple systems. Other noteworthy features:

  1. The ability to assign action items to other team members, especially when a team member is unavailable due to travel or leave (47 selections).
  2. The ability to view the history or audit trail in order to understand points of delay in resolving issues (46 selections).
  3. Identifying issues or patterns at the site (46 selections).

A significant number of respondents (49 selections) have yet to determine the desired features. Respondents from non-large CROs were well-represented in the selection of this option. A small proportion indicated the need to verify if the issues observed during the course of the study are in line with the risks that were expected.

 

Q20. What are/will be the top challenges facing your organization in implementing RBM? Select all that apply.


This survey question was created to understand the barriers in implementing risk-based approaches to monitoring that enable ICH-E6(R2) compliance. Recognizing the multi-factorial nature of implementation challenges, the survey enabled respondents to choose multiple options.

The highest number of respondents (67 selections) believed that creating the right processes is a challenge. Respondents from non-large CROs and academic research organizations, especially from Europe, the Middle East and the Asia-Pacific region, contributed heavily to this option. A significant number (55 selections) also felt that they did not have people with the right skill set to perform risk planning, which is one of the most critical processes. Significant efforts in training and the complexity of repurposing existing staff to operate in the new model were other significant pain points during implementation (46 selections). Interestingly, respondents from non-large organizations named most of those challenges.

There were also concerns about convincing clinical operations staff to reduce SDV levels at sites (47 selections), a step that some monitoring teams were not comfortable taking. This hesitation may be due to longstanding familiarity with the conventional practice of performing 100% SDV during monitoring visits and the fear of potentially missing study or data issues with subjects. There was also a section of respondents (37 selections) who felt that it was difficult to differentiate whether an alert was true, false or redundant (trying to filter signals out of data noise); a significant majority of respondents from large organizations selected this factor. Understandably, large organizations are adopting automated technologies and are likely feeling this pain point. Scalability of the processes using existing tools is another challenge that certain respondents experienced (36 selections). Under the “Other” section, a few other challenges surfaced:

  1. Getting site team members to understand the new model and the support required for it.
  2. Inability to seamlessly integrate systems for data accrual.
  3. Adjusting monitoring levels automatically.
  4. Lack of resources to perform data monitoring.
  5. Understanding that RBM is holistic across departments and not only a function of monitoring.

  

About eClinical Forum

eClinical Forum is a non-commercial forum founded in 2000 to promote open discussion and exchange in order to drive innovation and to maximize the success of eClinical initiatives. It is a global group run by its members for its members, with representatives from the pharma, healthcare, regulatory, academic and support sectors.

 

About the Author

Abby Abraham is the vice president of Data Analytics and Risk-Based Monitoring at OmniComm Systems, Inc. Prior to joining OmniComm, he was the co-founder of Algorics and was responsible for ideation, design and operational delivery of Acuity, a best-in-class global analytics and RBM enablement platform for clinical research that was acquired by OmniComm Systems. In his current role, he enables ICH-E6(R2) implementation in organizations through optimal technology, process and staff interventions.

Abby is a pharmacologist by training and a healthcare management professional. He has more than 20 years of experience in the clinical research industry and has served in several senior leadership roles within global clinical operations at contract research organizations (CROs).

He actively participates in industry events by chairing, moderating and presenting scientific sessions in forums organized by The Society for Clinical Data Management (SCDM), DIA, Partnerships in Clinical Trials and other high-profile groups. Abby has also authored several articles in prestigious scientific publications.

Contributors

The author acknowledges Ken Light, executive vice president, Corporate Strategy, and Meredith Bell, clinical solutions consultant, both of OmniComm Systems, Inc., for their contributions to the design and execution of the survey featured in this article. The author also acknowledges the editorial assistance of Sharon Harvey Rosenberg, scientific journalist at OmniComm.

 

 

 

 

 

Tags: Risk-Based Monitoring, Centralized Monitoring, Acuity Analytics