Abstract

Objective
To implement a systematic and uniform approach to evaluating data sources for syndromic surveillance within the United States Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) Veterinary Services (VS) group.

Introduction
USDA-APHIS-VS utilizes several continuous data streams to increase our knowledge of animal health and provide situational awareness of emerging animal health issues. In addition, USDA-APHIS-VS often conducts pilot projects to determine whether regular data access and analysis are feasible and, if so, whether the information generated is useful. Syndromic surveillance was developed with three goals: to serve as a syndromic monitoring system for identifying new diseases, to act as an early warning system for emerging diseases, and to provide situational awareness of animal health status. Current efforts focus on monitoring diverse data, such as laboratory accessions or poison center calls, grouped into syndromic or other health indicator categories, and are not intended to identify specific pre-determined diseases or pathogens. It is essential to regularly evaluate and re-evaluate the effectiveness of our surveillance program. However, traditional surveillance evaluation methods are difficult to apply, since the objectives and outcomes of monitoring novel data streams from pilot projects are not easily measurable. An additional challenge in evaluating these data streams is identifying a method that can adapt to various contexts and inputs to support objective decisions.
Until recently, assessment efforts have examined the feasibility of regular analysis and reporting, but not the utility of the information generated, nor the plausibility and sustainability of longer-term or expanded efforts.

Methods
Methods for surveillance evaluation, syndromic surveillance evaluation, and specifically animal health syndromic surveillance evaluation were researched via a literature review, exploration of methods used in-house on traditional surveillance systems, and development over time of criteria seen as key to building functioning, sustainable systems focused on animal health syndromic surveillance. Several methods were adapted to create an approach that could organize information in a logical manner, clarify objectives, and make qualitative value assessments in situations where the quantitative aspects of costs and benefits were not always straightforward. More than 25 articles were reviewed to determine the best method of evaluation.

Results
The RISKSUR Evaluation Support Tool (EVA) provided the majority of the methodology for the evaluations of our data sources. The EVA tool allows an integrated approach to evaluation and flexible methods for measuring the effectiveness and benefits of various data streams. The most useful and common factors for evaluating pilot data sources of interest were how well the information generated by the data streams could provide early detection of animal health events, and how well and how often situational awareness information on animal health was generated. The EVA tool also helps identify and organize the criteria used to assess objectives and assign value.

Conclusions
The regular evaluation of syndromic surveillance data streams in animal health is necessary to make the best use of resources and maximize the benefits of data stream use.
It is also useful to conduct regular interim assessments on data streams in pilot phase to be certain key information for a final evaluation will be generated during the project. The RISKSUR EVA tool was found to be very flexible and useful for allowing estimates of value to be made, even when evaluating systems that do not have very specific, quantitatively measurable objectives. This tool provides flexibility in the selection of attributes for evaluation, making it particularly useful when examining pilot project data streams. In combination with additional review methodologies from the literature review, a systematic and uniform approach to data stream evaluation was identified for future use.