Abstract

Objective

To explore the quality of data submitted once a facility is moved into ongoing submission status, and to address the importance of continuing data quality assessments.

Introduction

Once a facility meets data quality standards and is approved for production, it is often assumed that the quality of the data received remains at the same level. When production data quality reports from various states were reviewed, generated using a SAS data quality program, a need for ongoing production data quality assessment was identified. By implementing a periodic data quality update for all production facilities, data quality has improved both for production data as a whole and for individual facilities. Through this activity, several root causes of data quality degradation have been identified, allowing processes to be put in place to mitigate their impact on data quality.

Methods

Many jurisdictions work with facilities during the onboarding process to improve data quality. Once a certain level of data quality is achieved, the facility is moved into production. At this point the jurisdiction generally assumes that the quality of the data being submitted will remain fairly constant. To check this assumption in Kansas, a SAS Production Report program was developed specifically to examine production data quality.

A legacy data set is downloaded from BioSense production servers by Earliest Date in order to capture all records for visits which occurred within a specified time frame. This data set is then run through a SAS data quality program which checks specific fields for completeness and validity and prints a report on counts and percentages of null and invalid values, outdated records, and timeliness of record submission, as well as examples of records from visits containing these errors. A report is created for the state as a whole and for each facility, EHR vendor, and HIE sending data to the production servers, with examples provided only by facility.
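The abstract does not reproduce the SAS program itself, but the completeness and validity checks it describes can be sketched in outline. The following is a minimal Python analogue, not the actual Kansas program: the field names and valid code sets are hypothetical illustrations, not the BioSense specification.

```python
# Hypothetical valid code sets for two example fields; the real program's
# field list and validity rules are defined by the syndromic surveillance
# messaging requirements, not shown in the abstract.
VALID_VALUES = {
    "patient_class": {"E", "I", "O"},
    "sex": {"M", "F", "U"},
}

def quality_report(records, fields):
    """For each field, count null and invalid values across all records
    and report them as counts and percentages, mirroring the kind of
    per-field summary the abstract describes."""
    total = len(records)
    report = {}
    for field in fields:
        # A missing key, None, or empty string counts as null.
        nulls = sum(1 for r in records if r.get(field) in (None, ""))
        valid_set = VALID_VALUES.get(field)
        invalid = 0
        if valid_set is not None:
            # Non-null values outside the allowed code set count as invalid.
            invalid = sum(
                1 for r in records
                if r.get(field) not in (None, "") and r[field] not in valid_set
            )
        report[field] = {
            "null_count": nulls,
            "null_pct": round(100.0 * nulls / total, 1) if total else 0.0,
            "invalid_count": invalid,
            "invalid_pct": round(100.0 * invalid / total, 1) if total else 0.0,
        }
    return report
```

In the program described above, summaries like these are produced for the state as a whole and broken out by facility, EHR vendor, and HIE, alongside example records containing each error type.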
The facility, vendor, and HIE reports include the state-level error percentages for comparison.

The Production Report was initially run on Kansas data for the first quarter of 2016, followed by consultations with facilities on the findings. Monthly checks of data quality were made before and after facilities implemented changes. An examination of Kansas' results showed a marked decrease in data quality for many facilities; every facility had at least one area in need of improvement.

The data quality reports and examples were sent to every facility sending production data during the first quarter, attached to an email requesting a 30-60 minute call with each facility to go over the report. This call was deemed crucial to the process: it had been over a year, and in a few cases over two years, since some of the facilities had looked at data quality, and they would need a review of the findings and of all requirements, new and old. Ultimately, over half of all production facilities scheduled a follow-up call.

While some facilities expressed some degree of trepidation, most were open to revisiting data quality and to making the requested improvements. Reasons for data quality degradation included updates to EHR products, changes of EHR product, workflow issues, engine updates, new requirements, and personnel turnover.

A request was made of other jurisdictions (including Arizona, Nevada, and Illinois) to examine their production data using the same program and compare quality. Data was pulled for at least one week of July 2016 by Earliest Date.

Results

Monthly reports have been run on Kansas production data both before and after the consultation meetings, and they indicate a marked improvement in both the completeness of required fields and the validity of values in those fields.
Data for these monthly reports was again selected by Earliest Date.

Conclusions

To ensure that production data continues to be of value for syndromic surveillance purposes, periodic data quality assessments should continue after a facility reaches ongoing submission status. Process changes include a review of production data at least twice per year, with a follow-up data review one month later to confirm that adjustments have been correctly implemented.