Quality management approach
This information sets out the principles and procedures that we follow to manage the quality of the data we use and present in our statistical publications.
We produce official statistics in line with the Code of Practice for Official Statistics 3.0 (‘Code 3.0’ or ‘the code’).
Standard 5, Practice 8 of the Code states: ‘Publish your quality management approach and explain how it aligns with your organisation’s commitment to data quality.’
Assessing data quality
We assess how we manage the quality of our statistical data against the Data Management Association (DAMA) International’s six core dimensions of data quality: accuracy, completeness, consistency, timeliness, uniqueness and validity.
Accuracy
The accuracy of the data we process reflects both:
- the accuracy of the data we capture, enter and hold in our business systems
- the accuracy of the data shared by external bodies at a particular point in time
We explain the limitations of the data sources we use to produce our official statistics in accompanying background webpages.
We carry out data processing checks that can help identify inaccurate records - for example, a date of birth that indicates a candidate is younger than five years old. Such instances are rare, but when they occur we ask our data steward to report the records to our data systems teams, and we check them ourselves before processing the data any further and including it in our statistical datasets.
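A check like the one described above can be sketched as follows. This is a minimal illustration only: the field names and the extract date are hypothetical, and the real checks are not specified in this text.

```python
from datetime import date

# Minimum plausible candidate age, per the example in the text above.
MIN_AGE_YEARS = 5

def implausibly_young(date_of_birth: date, extract_date: date) -> bool:
    """Return True if the date of birth implies a candidate younger
    than MIN_AGE_YEARS at the date the data was extracted."""
    age = extract_date.year - date_of_birth.year
    # Adjust if the birthday has not yet occurred this year.
    if (extract_date.month, extract_date.day) < (date_of_birth.month, date_of_birth.day):
        age -= 1
    return age < MIN_AGE_YEARS

# Flag suspect records for reporting rather than silently dropping them.
records = [
    {"candidate_id": "C001", "dob": date(2005, 6, 1)},
    {"candidate_id": "C002", "dob": date(2022, 3, 15)},  # implausibly young
]
flagged = [r for r in records if implausibly_young(r["dob"], date(2024, 8, 6))]
```

Flagged records would then be reported for checking rather than removed automatically, consistent with the review step described above.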
In our Confidentiality policy, we explain how we protect individuals’ information while still outputting statistical information that is as close as possible to actual values.
Completeness
We produce our statistics from internally held data, which we make sure is as near complete as possible. For example, we produce attainment statistics on results day data in August, and revised attainment statistics on the revised post-appeals data in December.
We define the datasets we use for our official statistics in each corresponding background webpage and we highlight to users any changes to datasets that significantly affect the quality dimension of our publications. We also detail how we pre-process and process our data.
Consistency
We extract data at similar points in the year to ensure consistency across years. Efficient data processing helps us to check the consistency of this data before we include it in our official statistics.
Because qualifications change continually, our statistical time series data cannot remain fully consistent over time. Where necessary, we include information on our data's coherence and comparability in background webpages.
Timeliness
We produce our statistics as quickly as possible using reproducible analytical pipelines (RAP). We review the timeframe between data availability, processing, pre-release access (where appropriate) and publication every year. If the team and the head of profession agree, we adjust our schedule to produce certain statistics earlier in the year.
Uniqueness
Processing using appropriate software means we can check for uniqueness, and identify and remove duplicate records during processing.
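A duplicate check of the kind described above can be sketched like this. The key fields (`candidate_id`, `qualification_code`) are hypothetical examples; the actual keys and software used are not specified in this text.

```python
def deduplicate(records, key_fields=("candidate_id", "qualification_code")):
    """Keep the first occurrence of each key; collect duplicates for review."""
    seen = set()
    unique, duplicates = [], []
    for record in records:
        key = tuple(record[f] for f in key_fields)
        if key in seen:
            duplicates.append(record)  # set aside rather than silently drop
        else:
            seen.add(key)
            unique.append(record)
    return unique, duplicates

records = [
    {"candidate_id": "C001", "qualification_code": "H225", "grade": "A"},
    {"candidate_id": "C001", "qualification_code": "H225", "grade": "A"},  # duplicate
    {"candidate_id": "C002", "qualification_code": "H225", "grade": "B"},
]
unique, duplicates = deduplicate(records)
```

Collecting the duplicates separately, rather than discarding them, leaves an audit trail of what was removed during processing.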
Validity
Data is valid if it conforms to the format, type and range of its definition. We carry out quality assurance during the pre-processing and processing stages of production to make sure it is valid.
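The three aspects of validity named above (format, type and range) can be illustrated with a small sketch. The field names, grade scheme and mark range are hypothetical assumptions, not the actual definitions used for these statistics.

```python
import re

# Hypothetical grade scheme for illustration only.
GRADE_PATTERN = re.compile(r"^[A-D]$|^NA$")

def validate(record: dict) -> list[str]:
    """Return a list of validity errors for one record."""
    errors = []
    if not isinstance(record.get("marks"), int):
        errors.append("marks must be an integer")          # type check
    elif not 0 <= record["marks"] <= 100:
        errors.append("marks must be between 0 and 100")   # range check
    if not GRADE_PATTERN.match(str(record.get("grade", ""))):
        errors.append("grade has an invalid format")       # format check
    return errors

errors = validate({"marks": 135, "grade": "E"})
```

Running every record through checks like these during pre-processing surfaces invalid values before they can reach the published statistics.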
We state limitations related to the quality of official statistics in the accompanying background webpages.
Additional quality management measures
Reproducible analytical pipelines (RAP)
We apply RAP principles across all of our productions. Using version control software means we can store the code for specific publications independently, and trace the changes made over time. This includes code we use to create the initial dataset, as well as code we use for data processing, quality assurance, and producing statistical summaries and outputs.
Rotation of analytical staff
We rotate the analytical staff who work on our statistical publications every year. This helps us to improve and innovate, and ensures continuous quality assurance of data and statistics.
Quality assurance
When we update code - for example, when we produce additional statistical summary information or tables - a team member not involved in production or development independently reviews the changes to the code base and the statistics produced.
Compliance
The head of profession (our head of data and analytics) has the final decision on any action necessary to comply with these procedures.
Questions or comments?
We want this information to be as helpful as possible. Please email us at data.analytics@qualifications.gov.scot with questions or general comments.
Our statistical practice is regulated by the Office for Statistics Regulation (OSR). OSR sets the standards of trustworthiness, quality and value in the Code. You can also email us at data.analytics@qualifications.gov.scot with your comments on how we’re meeting these standards, or you can contact OSR by email at regulation@statistics.gov.uk, or through the OSR website.