Quality by Design - A proactive, risk-based approach to designing and conducting fit-for-purpose clinical trials
Blog author: Dr. Arati Borkar
Designing Clinical Trials Right — From the Start
Quality by Design (QbD), as articulated in ICH E8 (R1) and reinforced in ICH E6 (R3), emphasises that clinical trials must fundamentally be scientifically sound, operationally feasible, and capable of generating reliable data that protects participant safety and supports regulatory decision-making.
This approach requires a prospective and systematic view of quality, where trial design and execution are assessed in the context of being ‘fit for purpose’. Identifying factors that are critical to quality, both in processes and in data, is essential to ensure that clinical trial objectives can be met reliably.
Fit for Purpose and its Implications
‘Fit for purpose’ is a dynamic concept that varies by context. Different trial settings demand different design and operational approaches. For example, studies conducted in in-patient settings, outpatient (OPD-based) settings, or field-based environments each present distinct challenges. These are further influenced by therapeutic area, geographic location, cultural context, regional medical practices, and participant populations.
Careful attention to trial design is fundamental. The protocol remains the most critical document in any clinical research programme. It should clearly define measurable objectives, eligibility criteria, risk-adjusted and participant-centred procedures, along with mechanisms to minimise bias. Protocols that attempt to address multiple, overly complex objectives can increase the risk of poor data quality and compromise the ability to meet study objectives.
Operationalising ‘Quality by Design’
Study plans translate protocol requirements into operational detail, with a focus on managing ‘Critical to Quality’ (CtQ) processes and data within each functional scope. These include, but are not limited to, monitoring plans, data management plans, statistical analysis plans, laboratory manuals, pharmacy manuals, and safety reporting plans.
Where processes are interlinked or data exchange occurs across functions, cross-functional review and coordination are essential. Well-integrated processes and seamless data flow increase the likelihood of consistent and reliable study outcomes.
To meet study objectives, data acquisition tools must focus on relevant, decision-critical data. Deliberation among data managers and statisticians on what data are “must-have”, “good-to-have”, or unnecessary helps reduce site burden and supports efficient collection of high-quality data.
Computerised systems for data acquisition enable efficient data collection, real-time validation, improved accuracy, and enhanced data security. However, challenges arise when data originate from multiple sources, such as specialised laboratories, electronic diaries, or wearable devices. Proactively identifying and addressing these challenges is essential to ensure that integrated data remain reliable and usable for all stakeholders.
Core data quality attributes include:
- Consistency - uniformity of data ascertainment over time
- Accuracy - correctness of data collection, transmission, and processing
- Completeness - minimisation of missing information
Risks to these attributes should be proactively identified and addressed to ensure that aggregated data provide actionable insights when large volumes of data are analysed.
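In practice, risks to completeness and accuracy are often caught through automated edit checks run against collected data. The following is a minimal illustrative sketch, not any specific EDC system's API: the record fields, plausibility range, and function names are all hypothetical, chosen only to show how completeness and accuracy checks might be expressed programmatically.

```python
from datetime import date

# Hypothetical participant visit records, as might be exported from a data
# acquisition system. Field names are illustrative, not a real standard.
records = [
    {"subject_id": "S001", "visit": "Baseline", "weight_kg": 71.2,  "visit_date": date(2024, 3, 1)},
    {"subject_id": "S001", "visit": "Week 4",   "weight_kg": None,  "visit_date": date(2024, 3, 29)},
    {"subject_id": "S002", "visit": "Baseline", "weight_kg": 250.0, "visit_date": date(2024, 3, 2)},
]

def check_completeness(rows, required=("subject_id", "visit", "weight_kg", "visit_date")):
    """Completeness: flag records with missing required fields."""
    return [(r["subject_id"], r["visit"], f)
            for r in rows for f in required if r.get(f) is None]

def check_accuracy(rows, field="weight_kg", low=30.0, high=200.0):
    """Accuracy: flag values outside an assumed plausible range (edit check)."""
    return [(r["subject_id"], r["visit"], r[field])
            for r in rows if r[field] is not None and not (low <= r[field] <= high)]

missing = check_completeness(records)
out_of_range = check_accuracy(records)
print(missing)       # [('S001', 'Week 4', 'weight_kg')]
print(out_of_range)  # [('S002', 'Baseline', 250.0)]
```

Real edit-check specifications are defined in the data management plan and validated within the study's computerised system; the sketch above only illustrates the underlying logic.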
Role-based system access, time-driven data entry controls, centralised data review, and audit trail assessment support early detection of emerging trends, missing data, inconsistencies, and protocol deviations, enabling proactive corrective action.
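Centralised data review of the kind described above often aggregates simple metrics, such as missing-data rates per site, so that emerging trends can be flagged for corrective action. The sketch below is a hypothetical illustration of that idea; the site identifiers, field names, and the 25% threshold are assumptions for demonstration, not regulatory or sponsor-defined values.

```python
from collections import defaultdict

# Hypothetical review log: (site_id, field_name, is_missing) entries
# produced by a centralised data review process.
entries = [
    ("Site-01", "hba1c", False), ("Site-01", "hba1c", True),
    ("Site-02", "hba1c", False), ("Site-02", "hba1c", False),
    ("Site-01", "bp_systolic", True), ("Site-01", "bp_systolic", True),
]

def missing_rate_by_site(rows):
    """Compute the fraction of missing data points per site."""
    totals, missing = defaultdict(int), defaultdict(int)
    for site, _field, is_missing in rows:
        totals[site] += 1
        missing[site] += is_missing  # bool counts as 0 or 1
    return {s: missing[s] / totals[s] for s in totals}

rates = missing_rate_by_site(entries)
flagged = [s for s, r in rates.items() if r > 0.25]  # threshold is illustrative
print(rates)    # {'Site-01': 0.75, 'Site-02': 0.0}
print(flagged)  # ['Site-01']
```

A flagged site would then be a candidate for targeted monitoring, retraining, or other corrective action, in line with the proactive approach described above.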
Equally important is the role of investigator sites. Appropriate site selection, training, and ongoing oversight are critical to protocol compliance and generation of good-quality data.
Effective Quality by Design cannot be achieved in isolation. Close collaboration and transparent communication among all stakeholders, including sponsors, CROs, investigator sites, specialised service providers, ethics committees, and regulatory authorities, from study inception through completion are fundamental to successful trial conduct.
Continuous coordination enables progress monitoring, early risk identification, timely mitigation, and prevention of serious non-compliance, all of which are essential to maintaining study quality.
In summary, Quality by Design is not a “box-ticking” exercise. It is a proactive, risk-based approach to identifying, implementing, and monitoring critical processes and data, with the objective of ensuring data reliability and protecting the rights, safety, and well-being of trial participants.
By focusing effort on what truly matters, Quality by Design supports meaningful study outcomes and informed decision-making.
Quality by Design in Practice at DiagnoSearch
At DiagnoSearch Life Sciences, Quality by Design informs how studies are planned, operationalised, and overseen across the trial lifecycle. Emphasis is placed on identifying what is truly critical to quality, integrating cross-functional perspectives early, and aligning processes and data collection with study objectives.
This approach supports consistent execution, reliable data generation, and proactive risk management, enabling sponsors to make confident, evidence-based decisions while maintaining a strong focus on participant safety and regulatory expectations.
If you are evaluating how to strengthen quality across your clinical development programmes, reach out to our team at: contactus@diagnosearch.com
