QUALITY ASSURANCE
Quality control of data is an integral part of all research and takes place at various stages: during data collection, data entry or digitisation, and data checking. It is vital to develop suitable procedures before data gathering starts.
Data collection
During data collection researchers must ensure that the data recorded reflect the actual facts, responses, observations and events. The quality of data collection methods used strongly influences data quality, and documenting in detail how data are collected provides evidence of such quality.
Quality control measures during data collection can include:
- calibration of instruments to check the precision, bias and/or scale of measurement
- taking multiple measurements, observations or samples
- checking the truth of the record with an expert
- using standardised methods and protocols for capturing observations, alongside recording forms with clear instructions
- using computer-assisted interview software to standardise interviews, verify response consistency, route and customise questions so that only appropriate questions are asked, confirm responses against previous answers where appropriate, and detect inadmissible responses (a minimal sketch of such checks follows this list)
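As a rough illustration, the sketch below shows in Python the kinds of automated checks computer-assisted interview software applies. The field names, admissible ranges and routing rules are hypothetical examples, not taken from any real instrument.

    # Sketch of automated response checking, in the style of
    # computer-assisted interview software. Field names and rules
    # are hypothetical examples.
    def check_response(record):
        """Return a list of problems found in one interview record."""
        problems = []
        # Inadmissible value: age must fall in a plausible range.
        if not 16 <= record.get("age", -1) <= 110:
            problems.append("age outside admissible range 16-110")
        # Routing: hours worked applies only to respondents in work.
        if record.get("employment_status") == "not working" and "weekly_hours" in record:
            problems.append("weekly_hours answered by a non-working respondent")
        # Consistency with a previous answer: a respondent cannot
        # leave school before being born.
        if record.get("year_left_school", 9999) < record.get("year_of_birth", 0):
            problems.append("year_left_school precedes year_of_birth")
        return problems

    record = {"age": 34, "employment_status": "not working",
              "weekly_hours": 20, "year_of_birth": 1990,
              "year_left_school": 2008}
    print(check_response(record))
    # ['weekly_hours answered by a non-working respondent']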
Digitisation and data entry
When data are digitised, transcribed, entered in a database or spreadsheet, or coded, quality is ensured by standardised and consistent data entry procedures with clear instructions. These may include:
- setting up validation rules or input masks in data entry software (see the sketch after this list)
- using data entry screens
- using controlled vocabularies, code lists and choice lists to minimise manual data entry
- detailed labelling of variable and record names to avoid confusion
- designing a purpose-built database structure to organise data and data files
- accompanying notes and documentation about the data
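Where a dedicated entry tool is not available, validation rules, input masks and controlled vocabularies can be expressed directly in code. The sketch below is a minimal Python example; the code list and date mask are invented for illustration.

    import re

    # Controlled vocabulary for one variable and an input mask
    # (regular expression) for another; both are invented examples.
    MARITAL_STATUS_CODES = {"1": "single", "2": "married", "3": "divorced",
                            "4": "widowed", "9": "not stated"}
    DATE_MASK = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # e.g. 2014-03-27

    def validate_entry(marital_status, interview_date):
        errors = []
        if marital_status not in MARITAL_STATUS_CODES:
            errors.append(f"marital_status '{marital_status}' not in code list")
        if not DATE_MASK.match(interview_date):
            errors.append(f"interview_date '{interview_date}' does not match YYYY-MM-DD")
        return errors

    print(validate_entry("2", "2014-03-27"))  # []
    print(validate_entry("7", "27/03/2014"))  # both rules report errors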
Data checking
During data checking, data are edited, cleaned, verified, cross-checked and validated. Checking typically involves both automated and manual procedures:
- double-checking coding of observations or responses and out-of-range values
- checking data completeness
- adding variable and value labels where appropriate
- verifying random samples of the digital data against the original data
- double entry of data
- statistical analyses such as frequencies, means, ranges or clustering to detect errors and anomalous values (see the sketch after this list)
- correcting errors made during transcription
- peer review
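Several of these checks are straightforward to automate. The sketch below, assuming a small tabular dataset handled with the pandas library, shows out-of-range detection, completeness counts, frequency tables and a double-entry comparison; the variable names and values are hypothetical.

    import pandas as pd

    # Two independently keyed versions of the same hypothetical records.
    entry1 = pd.DataFrame({"id": [1, 2, 3], "age": [34, 290, 51],
                           "sex": ["F", "M", None]})
    entry2 = pd.DataFrame({"id": [1, 2, 3], "age": [34, 29, 51],
                           "sex": ["F", "M", "F"]})

    # Out-of-range values: age should lie between 16 and 110.
    print(entry1[~entry1["age"].between(16, 110)])

    # Completeness: missing values per variable.
    print(entry1.isna().sum())

    # Frequencies to spot anomalous codes.
    print(entry1["sex"].value_counts(dropna=False))

    # Double entry: cells where the two keyed files disagree.
    print(entry1.compare(entry2))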
UK Data Archive quality control
At the Archive we apply various quality control checks to all research data whilst we process them for archiving in the data collection. The level of quality control depends on how much additional value is to be added to the data, based on anticipated future usage.
Adding value
Researchers can add significant value to their datasets by including additional variables or parameters that widen the possible applications. Including standard parameters or generic derived variables in data files may substantially increase the potential re-use value of a dataset and provide new avenues for research. For example, geo-referencing data may allow other researchers to more easily add value to data and apply the data in geographical information systems. Equally, sharing field notes from an interviewing project can help enrich the research context.
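As a simple illustration of a generic derived variable, the Python sketch below bands a raw age measurement into grouped categories of the kind that can widen a dataset's re-use; the band boundaries are an invented example.

    import pandas as pd

    # Adding a derived variable (age band) alongside the raw value;
    # the banding scheme is an invented example.
    data = pd.DataFrame({"id": [1, 2, 3], "age": [19, 42, 67]})
    data["age_band"] = pd.cut(data["age"], bins=[15, 24, 44, 64, 110],
                              labels=["16-24", "25-44", "45-64", "65+"])
    print(data)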
Qualitative data: recorded interviews
The quality of interview data gathered by means of recorded interviews depends on both the skill of the interviewer and the quality of the audio-visual equipment.