Abstract

Objective
To describe the evaluation process used to assess data quality during development of an electronic case reporting application, and to describe the evaluation results.

Introduction
Electronic case reporting (eCR) is defined as the fully or semi-automated generation and electronic transmission of reportable disease case reports from an electronic health record (EHR) system to public health authorities, replacing the historically paper-based process [1]. eCR has been reported to increase the number, accuracy, completeness, and timeliness of surveillance case reports [2]. The Chicago Department of Public Health (CDPH) collaborated with the Alliance of Chicago (AOC) to develop an application that generates electronic provider reports (ePR) for chlamydia (CT) and gonorrhea (GC) cases from the EHR system managed by AOC and sends the ePR records to the Illinois National Electronic Disease Surveillance System (I-NEDSS). The application was tested against the EHR database of Health Center A in AOC's network. It is essential that ePR data be accurate, so that public health receives correct information and can take action if needed. Evaluation is therefore needed to assess the data quality of ePR records.

Methods
CDPH developed a five-step evaluation plan to validate ePR record data quality. Step 1 validated the ePR file format: all I-NEDSS required fields had to be present, the required value sets had to be used, and the file format could not vary across generated files. Step 2 validated algorithm accuracy; chart review was conducted to ensure the ePR records did not include non-reportable cases. Step 3 reviewed ePR records loaded into I-NEDSS to confirm that all values in the raw ePR files appeared correctly on the I-NEDSS front end. After the application passed steps 1 through 3, it moved to step 4, parallel validation. The first phase of parallel validation was a review of historic cases.
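The Step 1 file-format check described above can be sketched in code. This is a minimal illustration, not the actual application: the field names and the race codes in the value set are hypothetical stand-ins for the real I-NEDSS specification.

```python
# Hypothetical sketch of the Step 1 check: every I-NEDSS required field must
# be present, and coded fields must use the required value set. Field names
# and codes here are illustrative placeholders, not the real specification.

REQUIRED_FIELDS = {"patient_address", "treatment", "treatment_date", "race"}
RACE_VALUE_SET = {"1002-5", "2028-9", "2054-5", "2076-8", "2106-3", "UNK"}  # illustrative

def validate_epr_record(record: dict) -> list:
    """Return a list of validation errors for one ePR record."""
    # Flag any required field missing from the record.
    errors = sorted(
        "missing required field: " + f for f in REQUIRED_FIELDS - record.keys()
    )
    # Flag race values outside the expected value set.
    race = record.get("race")
    if race is not None and race not in RACE_VALUE_SET:
        errors.append("race value not in value set: " + race)
    return errors

record = {"patient_address": "123 Main St", "treatment": "azithromycin", "race": "2106-3"}
print(validate_epr_record(record))  # → ['missing required field: treatment_date']
```

In practice a check like this would run over every record in each generated ePR file, with any errors fed back to the development team, mirroring the weekly vetting of test files described in the Results.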
Test ePR records for CT and GC cases diagnosed by Health Center A in 2015 (n=510) were compared with the same 510 cases' closed surveillance case reports in I-NEDSS, examining the completeness of treatment, race, and ethnicity. The application then moved into testing of the daily data feed: daily ePR records were compared with EHR charts and with paper provider reports received by CDPH to assess completeness and timeliness. Step 5 re-evaluated the algorithms; ePR records were validated against electronic laboratory reporting (ELR) records, used as the gold standard for all reportable CT and GC cases, to find missing cases.

Results
The first three steps of the evaluation took place from January to April 2016. Test ePR files containing historic cases from Health Center A were vetted weekly; a total of 14 test ePR files were reviewed. This process identified required fields that were not present (patient address, treatment date, treatment, and race), race value sets that were not returned correctly, and additional logic statements needed to return the correct pregnancy status at the time of diagnosis. These issues were discussed with the project team, and the application was modified accordingly. The historic case review found ePR data to be more complete than closed surveillance reports. Compared with closed surveillance reports in I-NEDSS, 18% (94/510) of cases had incomplete treatment information in the ePR records versus 78% (400/510) in the closed reports; 0.2% (1/510) lacked race information in the ePR records versus 47% (240/510); and 0.7% (4/510) lacked ethnicity information in the ePR records versus 50% (253/510). These preliminary evaluation results suggest that eCR improves the data quality of surveillance case reports. Evaluation of the daily data feed's data quality is ongoing, and ePR data quality will be monitored continuously.

Conclusions
Evaluation plays an integral role in developing and implementing the eCR process in Chicago.
The stepwise evaluation process ensures that ePR data quality meets public health requirements, so that public health will be able to act on more complete information to improve population health.
Authors own copyright of their articles appearing in the Online Journal of Public Health Informatics. Readers may copy articles without permission of the copyright owner(s), as long as the author and OJPHI are acknowledged in the copy and the copy is used for educational, not-for-profit purposes. Share-alike: when posting copies or adaptations of the work, release the work under the same license as the original. For any other use of articles, please contact the copyright owner. The journal/publisher is not responsible for subsequent uses of the work, including uses infringing the above license. It is the author's responsibility to bring an infringement action if so desired by the author.