1 Policy summary
The data quality policy of Rotherham, Doncaster and South Humber NHS Foundation Trust (RDaSH) outlines the commitment to maintaining accurate, complete, timely, relevant, and accessible data across all organisational systems to support safe patient care, effective service delivery, compliance, and planning.
The policy defines responsibilities for all colleagues, directorates, and system owners to ensure data integrity through regular audits, kite marking, validation checks, incident reporting, and compliance with national standards.
Data quality is monitored through dashboards, spot checks, and governance processes.
2 Introduction
Data quality is defined as the degree to which data meets an organisation’s expectations of accuracy, validity, completeness, and consistency. It is a critical aspect of data management, ensuring that the data used for analysis, reporting, and decision-making is reliable and trustworthy.
By tracking data quality, an organisation can pinpoint potential issues harming quality and ensure that shared data is fit to be used for a given purpose.
When collected data fails to meet an organisation’s expectations of accuracy, validity, completeness, and consistency, it can have significant negative impacts on service delivery, employee productivity, and key strategies.
In this context, poor data quality can result in clinical risks, financial loss, inefficiency, and legal consequences. The Data Protection Act (2018) mandates that personal information must be accurate and current, across both electronic and paper records.
Information is vital in supporting the trust to achieve its goals.
3 Purpose
The purpose of this policy is to set out the trust’s data quality principles. These principles will be adopted and supported by data quality procedures.
4 Scope
The policy applies to all colleagues and all data entered in organisational systems.
For further information about responsibilities, accountabilities and duties of all colleagues, please see appendix A.
5 Procedure
5.1 Quick guide
5.1.1 Governance
- Data integrity is to be ensured at the point of collection.
- Data quality forms part of the broader Information governance policy and management framework.
- Key organisational systems are monitored by the information quality work programme aligned to the integrated quality performance report (IQPR) and data validation procedure.
5.1.2 Storage
Personal data storage, processing, and reporting must follow the Data Protection Act (2018) and Trust information governance policies.
5.1.3 System owners
A system owner is a team or individual appointed by the trust, responsible for the configuration, maintenance and reporting of a digital system.
5.1.4 Incident reporting
All data errors must be reported via the incident reporting system. Prompt corrective action is required.
5.1.5 Training
Each system should have specific training available for its users.
5.2 Governance framework
Data quality forms part of the broader information governance policy and management framework where roles and responsibilities are defined. Data integrity is to be ensured at the point of collection.
5.3 Data standards
Clinical data must conform to the NHS Data Model and Dictionary Service standards, including:
- ICD: diagnoses
- OPCS: procedures
- SNOMED CT: clinical terminology
Where national standards are unavailable, local standards will be created and reviewed annually.
5.4 Data storage
Personal data storage, processing, and reporting must follow the Data Protection Act (2018) and trust information governance policies. Colleagues should refer to data security and protection breaches or information governance incident reporting policy for further guidance.
5.5 Data quality measurement
The Information Quality team (IQT) will perform data quality checks for key performance indicators (KPIs) featured in the IQPR. For each key performance indicator, an assessment is undertaken to identify the level of compliance against each of the following six principles of data quality, and a kite mark is applied (see appendix C). System owners should also ensure that these quality measures are adhered to:
- timeliness: the time between an event and when its data is available for use. Acceptable lag varies by performance indicator, but data should be captured promptly and made available quickly enough to support decision-making and service management
- monitoring: the degree to which the trust can explore data to assess performance. The required level of detail varies by indicator; some data must be at patient level, while other metrics may be sufficient at specialty or trust level
- completeness: both the presence of all required data fields and the inclusion of all relevant records for the target population
- validation: the process of ensuring data accuracy and compliance with rules. The required validation level varies by indicator and risk, with final approval marking sufficient validation
- accuracy: how well data reflects real-world scenarios, making it trustworthy for insights and decisions. As accuracy can change over time, regular monitoring is essential to detect and correct inaccuracies
- accessibility: how easily users can access, process, and understand data, supported by effective tools and processes that enable data recording and retrieval without barriers
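As an illustration only, principles such as completeness and timeliness lend themselves to simple automated checks. The sketch below is a minimal example: the record fields, the required-field list, and the five-day lag threshold are all assumptions for illustration, not drawn from any trust system or from this policy.

```python
from datetime import date, timedelta

# Hypothetical activity records; field names are illustrative only.
records = [
    {"nhs_number": "9434765919", "event_date": date(2025, 10, 1), "specialty": "CAMHS"},
    {"nhs_number": None,         "event_date": date(2025, 10, 2), "specialty": "Adult MH"},
]

REQUIRED_FIELDS = ["nhs_number", "event_date", "specialty"]
MAX_LAG_DAYS = 5  # assumed acceptable lag; in practice this varies by indicator


def completeness(recs):
    """Share of records with every required field populated."""
    complete = sum(all(r.get(f) is not None for f in REQUIRED_FIELDS) for r in recs)
    return complete / len(recs)


def timely(rec, as_of):
    """True if the record was available within the assumed acceptable lag."""
    return (as_of - rec["event_date"]) <= timedelta(days=MAX_LAG_DAYS)


print(f"completeness: {completeness(records):.0%}")  # one of two records is complete
```

Checks like these could feed the routine reporting and exception management described in section 5.6, with thresholds set per indicator.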
5.6 Validation and assurance
Data is monitored through routine reporting and exception management. Validation rules are embedded into systems and supported by training and reports. Data issues are reviewed by the respective directorates and raised through the appropriate clinical, operational, and governance structures.
The data quality group provides a forum to escalate and explore data quality issues.
5.7 Information management and business intelligence (IMBI) spot checks
Monthly reviews of high-profile indicators ensure accurate reporting and system use. Errors are corrected, and findings shared with relevant teams.
The Information Management and Business Intelligence team monitors the data quality maturity index (DQMI), a monthly publication about data quality in the NHS which provides data submitters with timely and transparent information.
5.8 Incident reporting
All data errors must be reported via the incident reporting system. Prompt corrective action is required. System training and guidance aim to prevent recurrence.
5.9 Audits
Audits (clinical, internal, and external) are used to improve processes, confirm data integrity, inform targeted training, ensure compliance, and validate data quality. Audit scope is defined prior to commencement.
6 Training implications
There are no specific training needs in relation to this policy, but owners and users of digital systems, and any other individual or group with a responsibility for implementing its contents, need to be familiar with this policy.
Each system should have training available for its users.
Policy dissemination channels:
- Radar
- staff app
- monthly LEARN events
- intranet
- practice development days
- local induction
7 Equality impact assessment screening
To access the equality impact assessment for this policy, please email rdash.equalityanddiversity@nhs.net to request the document.
7.1 Privacy, dignity and respect
The NHS Constitution states that all patients should feel that their privacy and dignity are respected while they are in hospital. High Quality Care for All (2008), Lord Darzi’s review of the NHS, identifies the need to organise care around the individual, “not just clinically but in terms of dignity and respect”.
As a consequence, the trust is required to articulate its intent to deliver care with privacy and dignity that treats all service users with respect. Therefore, all procedural documents will be considered, if relevant, to reflect the requirement to treat everyone with privacy, dignity and respect (when appropriate, this should also include how same sex accommodation is provided).
7.1.1 How will this be met
No issues have been identified in relation to this policy.
7.2 Mental Capacity Act (2005)
Central to any aspect of care delivered to adults and young people aged 16 years or over will be the consideration of the individuals’ capacity to participate in the decision-making process. Consequently, no intervention should be carried out without either the individual’s informed consent, or the powers included in a legal framework, or by order of the court.
Therefore, the trust is required to make sure that all staff working with individuals who use our service are familiar with the provisions within the Mental Capacity Act (2005). For this reason, all procedural documents will be considered, if relevant, to reflect the provisions of the Mental Capacity Act (2005), ensuring that the rights of individuals are protected, that they are supported to make their own decisions where possible, and that any decisions made on their behalf when they lack capacity are made in their best interests and are the least restrictive of their rights and freedoms.
7.2.1 How will this be met
All individuals involved in the implementation of this policy should do so in accordance with the principles of the Mental Capacity Act (2005).
8 Links to any associated documents
The completed equality impact assessment for this policy has been published on this policy’s webpage on the trust policy website.
9 References
- The Data Protection Act (2018)
- NHS England Commissioning for Quality and Innovation (CQUIN)
- Information Commissioner’s Office. Freedom of Information or Data protection
- Data Security and Protection Toolkit NHS England
- NHS Data Model and Dictionary
- NHS Digital Mental Health Services Data Set
10 Appendices
10.1 Appendix A responsibilities, accountabilities and duties
10.1.1 The trust
The trust has a duty of care and a duty of confidentiality to ensure that all aspects of record keeping are properly managed. The trust must adhere to the legislative, statutory, and good practice guidance requirements relating to record management.
10.1.2 The chief executive
The chief executive has overall accountability and responsibility for records within the trust. This function is delegated to respective executives and system owners. Responsibility for healthcare records is delegated to the executive medical director and the executive director of nursing and allied health professionals, who are responsible for driving high quality standards of healthcare record keeping.
10.1.3 The trust’s executive medical director
The trust’s executive medical director (and trust Caldicott guardian) plays a key role in ensuring that NHS and partner organisations comply with current national guidance and relevant legislation regarding the handling and safeguarding of patient identifiable information. The Caldicott guardian will advise colleagues on matters relating to the management of patient identifiable information, for example where issues such as the public interest conflicts with duties such as maintaining confidentiality.
10.1.4 The director of health informatics
The director of health informatics will oversee development and implementation of data quality policies and promote a data quality culture in addition to supporting the wider data quality of the organisation through the trust’s data warehouse and business intelligence reporting solutions; supported by the wider health informatics portfolio, predominantly by the Information Management and Business Intelligence team and Information Quality (IQ) teams.
10.1.5 Each trust executive
Each trust executive has a responsibility to ensure that systems owned by their directorates are managed in accordance with this policy and adhere to the complete, accurate, relevant, accessible, and timely (CARAT) principles.
10.1.6 Senior managers of the trust
Senior managers of the trust are responsible for the quality of data generated in their areas by colleagues in their teams.
10.1.7 Head of contracting and performance
The head of contracting and performance provides guidance and support across a range of clinical data collection processes and advises on data quality improvements or changes necessary for reporting on current and developing performance measures, such as Commissioning for Quality and Innovation (CQUIN) schemes and key performance indicators. They also actively monitor and comment on performance trends and support staff to improve them.
10.1.8 Head of information management and business intelligence
The head of information management and business intelligence (IMBI) will advise the trust on how to maintain an efficient and effective patient information system, which complies with all the data collections required within the NHS.
10.1.9 Directorates
- Ensure systems support accurate and timely data collection.
- Ensure representation at the data quality group.
- Engage with the Performance and Reporting teams in system and reporting developments, including agreeing sign-off of the final product.
- Promote and support the use of superusers to further embed good data quality across systems.
10.1.10 System owners
A system owner is a team or individual appointed by the trust, responsible for the configuration, maintenance and reporting of a digital system. System owners:
- configure systems in line with standards and promote data accuracy
- perform adequate testing (peer review, clinical assurance, user acceptance testing (UAT))
- provide comprehensive training with competency assessments
- offer end-user support aligned with data quality principles
10.1.11 Clinical and research governance
Clinical and research governance depend on accurate data to uphold standards in clinical care, research quality, and patient experience. Refer to the research governance policy.
10.1.12 Business and performance management
Accurate data supports internal operations, external contracts, service planning, and new service development.
10.1.13 National requirements
The trust complies with national standards and uses a submission assurance framework for national submissions (staff access only).
10.2 Appendix B monitoring arrangements
10.2.1 All organisational systems
- How:
- information quality work programme aligned to the integrated quality performance report (IQPR)
- each system owner to ensure user and compliance data is fed through directorate structures and relevant clinical leadership executive (CLE) sub-groups for oversight
- data validation procedure
- Who:
- digital transformation group
- information governance and cyber group
- data quality group
- Reported to: Finance, Digital and Estates Committee
- Frequency: bimonthly
10.3 Appendix C kite marking methodology framework
10.3.1 Introduction and background
In order to meet the ever-increasing financial, performance and quality challenges faced by many organisations, provider boards and commissioning bodies require access to high quality data to ensure that business-critical decisions are based on the right information at the right time.
Poor quality data could have an adverse impact on an organisation’s ability to commission and manage services effectively. Therefore, it is important for organisations to identify whether the systems and data, on which assurances around performance are founded, are fit for purpose.
A data quality kite mark can be the principal device to assure an organisation of the quality of the information reported to the board. Although the application of data quality kite marks continues to evolve, there is no common approach across the NHS.
We introduced data quality kite marking in 2020 based on guidance from 360 Assurance. Our kite marking remit and methodology have been refined for 2024 and 2025 following extensive scoping to determine how other trusts apply data quality kite marks.
10.3.2 Definition of data quality
Data is of high quality if it is fit for its intended uses in operations, decision-making and planning. Within an organisation, acceptable data quality is crucial to operational and transactional processes and to the reliability of business analytics or business intelligence reporting. Data quality is affected by the way the data is entered, stored, analysed, managed and reported.
10.3.3 Why is data quality important?
Acceptable data quality is crucial to operational and transactional processes and to the reliability of business analytics and business intelligence reporting. High quality information leads to better decision-making to improve patient care and patient safety, and there are potentially serious consequences if information is not correct and up to date. Management information produced from patient data is essential for the efficient running of the trust and to maximise utilisation of resources for the benefit of patients and staff. Poor data quality puts organisations at significant risk of: damaging stakeholder trust; weakening frontline service delivery; incurring financial loss; and poor value for money.
10.3.4 What is kite marking?
A data quality kite mark is a visual indicator that acknowledges the variability of data and makes an explicit assessment of the quality of data on which the performance measurement is based. It is designed to appear next to key performance indicators (KPIs) included within performance reporting to provide assurance on the data quality.
10.3.5 Kite marking methodology
Each key performance indicator is assessed via a two-phased approach. Phase one assesses against four dimensions of data quality: timeliness, monitoring, completeness and validation. Phase two assesses against three dimensions of data quality: accuracy, completeness and accessibility. For each dimension assessed, a RAG-rated outcome, which shows the strength of the assurance, is determined:
- limited assurance (red)
- moderate assurance (amber)
- high assurance (green)
The breakdown of ratings, across the two phases, is represented via a donut chart. The outcome ratings also have accompanying scores that contribute to an overall data quality percentage rating.
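The policy does not publish the scores behind each RAG outcome, so the sketch below uses hypothetical values (red = 0, amber = 1, green = 2) purely to illustrate how rated dimensions could roll up into an overall data quality percentage; the actual scoring used by the Information Quality team may differ.

```python
# Hypothetical scores per RAG outcome; the real values used by the
# Information Quality team are not defined in this policy.
SCORES = {"red": 0, "amber": 1, "green": 2}

# Example ratings for one KPI: phase one (timeliness, monitoring,
# completeness, validation) and phase two (accuracy, completeness,
# accessibility), matching the dimensions named in the methodology.
ratings = {
    "timeliness": "green",
    "monitoring": "amber",
    "completeness_phase1": "green",
    "validation": "green",
    "accuracy": "amber",
    "completeness_phase2": "green",
    "accessibility": "green",
}


def overall_percentage(ratings):
    """Overall data quality rating as a share of the maximum possible score."""
    achieved = sum(SCORES[r] for r in ratings.values())
    maximum = len(ratings) * max(SCORES.values())
    return 100 * achieved / maximum


print(f"{overall_percentage(ratings):.0f}%")
```

Under these assumed scores, two amber and five green outcomes give 12 of a possible 14 points, roughly an 86% overall rating; the donut chart described above would show the same breakdown visually.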
10.3.6 Dimensions of data quality
10.3.6.1 Timeliness
This is the time taken between the end of the data period and when the information can be produced and reviewed. The acceptable data lag will be different for different performance indicators. Data should be captured as quickly as possible after the event or activity and must be available for the intended use within a reasonable time. Data must be available quickly and frequently enough to support information needs and to influence the appropriate level of service or management decisions.
10.3.6.2 Monitoring
This is the degree to which the trust can drill down into data to review and understand operational performance. The level to which the trust needs to drill down into the data will vary for different performance indicators. Some information should always be available at patient level for performance monitoring purposes, whereas some information may be sufficient at specialty level for all specialties, or even at trust level.
10.3.6.3 Completeness
There are two aspects to completeness: the extent to which all the expected attributes of the data are populated, and the extent to which all the records for the relevant population are provided.
10.3.6.4 Validation
This is the extent to which the data has been validated to ensure it is accurate and in compliance with relevant requirements, for example the correct application of rules and definitions. The level of validation required will vary from indicator to indicator and will depend on the level of data quality risk. Final validation is classified as sufficient where validation has been completed and where the indicator has received final approval from responsible individuals.
10.3.6.5 Accuracy
Data accuracy is the level to which data represents the real-world scenario or events it describes. Accuracy means that the data can be trusted to provide reliable insights and support informed decision-making processes. This data quality dimension should be monitored regularly, as it is the most likely to change over time. Monitoring data accuracy ensures that any changes are identified with the opportunity to correct them.
10.3.6.6 Accessibility
Data accessibility refers to the ease with which users can access, process and understand data within an organisation or system without unnecessary barriers. Accessibility is more than just having the data; it encompasses the processes and tools that make the recording of the data possible and the subsequent retrieval of interpretable and usable data.
10.3.7 Scope
Data quality kite mark application is aligned with the trust’s integrated quality performance report (IQPR) but is not limited to it. There are occasions where a key performance indicator not included within the integrated quality performance report will be assessed and a kite mark applied.
10.3.8 Roles and responsibilities
Whilst it is recognised that data quality is the responsibility of everyone in the trust, the responsibilities in relation to the application of data quality kite marks sit within the Information Quality team, which is part of the wider health informatics portfolio.
- Director of health informatics has overall accountability for the implementation and ongoing management of data quality kite marks.
- Finance Digital and Estates Committee (FDE) agrees and oversees the kite marking programme.
- Head of information quality is responsible for putting systems and tools in place to manage the application of data quality kite marks.
- Information quality officers are responsible for undertaking data quality reviews and the application of kite marks.
- Colleagues in other corporate functions and, or care groups are expected to support the kite marking process by responding to queries from the Information Quality team.
10.3.9 Process
This is set out via a process map. Please see process map (staff access only).
10.3.10 Frequency
The overarching aim is to review every key performance indicator included in the integrated quality performance report (IQPR). At the beginning of each financial year, the priority areas are determined and the 12-month schedule for the kite marking programme is agreed at this point. Sometimes there is a need to adjust the programme mid-year to accommodate other in-year priority areas, changes in system configuration, changes to national or local guidance and, or changes to source report logic. Some key performance indicators will be scheduled for a repeat review where the overall rating was “limited assurance” and an improvement plan has been initiated.
As and when the ambition to kite mark all key performance indicators included in the integrated quality performance report is achieved, the programme will repeat.
10.3.11 Review
Kite marking remit and methodology will be reviewed every three years as a minimum.
Document control
- Version: 9.
- Unique reference number: 286.
- Ratified by: digital transformation group.
- Date ratified: 14 October 2025.
- Name of originator or author: information quality manager.
- Name of responsible committee or individual: director of health informatics.
- Date issued: 12 November 2025.
- Review date: 31 October 2028.
- Target audience: operational management, clinical, medical and administrative support staff and all trust staff who are responsible for the collection and storage of data and information.
Page last reviewed: November 12, 2025
Next review due: November 12, 2026