Journal Information
Journal ID (publisher-id): OJPHI
ISSN: 1947-2579
Publisher: University of Illinois at Chicago Library
Article Information
©2013 the author(s)
open-access: This is an Open Access article. Authors own copyright of their articles appearing in the Online Journal of Public Health Informatics. Readers may copy articles without permission of the copyright owner(s), as long as the author and OJPHI are acknowledged in the copy and the copy is used for educational, not-for-profit purposes.
Electronic publication date: April 4, 2013
Collection publication date: 2013
Volume: 5
E-location ID: e131
Publisher ID: ojphi-05-131

A Review of Evaluations of Electronic Event-based Biosurveillance Systems
Kimberly Gajewski*1
Jean-Paul Chretien2
Amy Peterson2
Julie Pavlin3
Rohit Chitale2
1Emory University, Atlanta, GA, USA;
2Division of Integrated Biosurveillance, Silver Spring, MD, USA;
3Headquarters, Armed Forces Health Surveillance Center, Silver Spring, MD, USA
*Kimberly Gajewski, E-mail: kimberly.gajewski@emory.edu

Abstract
Objective

To assess evaluations of electronic event-based biosurveillance systems (EEBSs) and define priorities for EEBS evaluations.

Introduction

EEBSs that use near real-time information from the Internet are an increasingly important source of intelligence for public health organizations (1,2). However, there has been no systematic assessment of EEBS evaluations, which could identify uncertainties about current systems and guide EEBS development to effectively exploit digital information for surveillance.

Methods

We searched PubMed and consulted EEBS experts to identify EEBSs that met the following criteria: uses publicly available Internet information sources, includes events that affect humans, and has global scope. Using guidelines for evaluating health surveillance systems, we constructed a list of 17 key evaluation variables and identified the key variables included in the evaluations of each EEBS, as well as the number of EEBSs evaluated for each key variable (3,4).

Results

We identified 10 EEBSs and 17 evaluations (Table 1). The number of evaluations per EEBS ranged from 1 (GENI-DB, GODsN) to 7 (GPHIN, HealthMap). The median number of variables assessed per EEBS was 6 (range, 3–12), with 5 (25%) evaluations assessing 7 or more variables. Nine (53%) published evaluations contained quantitative assessments of at least 1 variable. The least frequently studied variable was cost. No papers examined usefulness in terms of specific public health decisions or outcomes resulting from early event detection, though 8 evaluations assessed usefulness by citing instances in which the EEBS detected an outbreak earlier or by eliciting user feedback.

Conclusions

While EEBSs have demonstrated their usefulness and accuracy for early outbreak detection, no evaluations have cited specific examples of public health decisions or outcomes resulting from EEBS use. Future evaluations should discuss these critical indicators of public health utility. They should also assess the novel aspects of EEBSs, include variables such as policy readiness, system redundancy, and input/output geography (5), and test the effects of combining EEBSs into a “super system”.


References
Heymann DL, et al. Hot spots in a wired world: WHO surveillance of emerging and re-emerging infectious diseases. Lancet Infect Dis. 2001;1:345–53.
Keller M, et al. Use of unstructured event-based reports for global infectious disease surveillance. Emerg Infect Dis. 2009;15:689–95.
German RR, et al.; Guidelines Working Group, Centers for Disease Control and Prevention (CDC). Updated guidelines for evaluating public health surveillance systems: Recommendations from the Guidelines Working Group. MMWR Recomm Rep. 2001;50(RR-13):1–35.
Buehler JW, et al. Framework for evaluating public health surveillance systems for early detection of outbreaks: Recommendations from the CDC working group. MMWR Recomm Rep. 2004;53(RR-5):1–11.
Corley CD, et al. Assessing the continuum of event-based biosurveillance through an operational lens. Biosecur Bioterror. 2012;10:131–141.

Tables
Table 1. Number of published evaluations and key variables assessed for identified EEBSs


EEBS        Year started   No. evaluations   No. key variables assessed
Argus       2005           5                 7
BioCaster   2006           5                 9
EpiSpider   2006           2                 4
GENI-DB     2012           1                 4
GODsN       2006           1                 3
GPHIN       1997           7                 10
HealthMap   2006           7                 12
MedISys     2006           2                 4
ProMED      1994           5                 12
PULS        2006           2                 5

Table 2. Key variables used in evaluations of EEBSs


Key evaluation variable     Ref.   No. evaluations using the variable   No. EEBSs evaluated on this variable
Acceptability               3,4    6                                    4
Accessibility               4      5                                    4
Cost                        4      3                                    2
Data quality                3,4    5                                    3
Flexibility                 3,4    2                                    3
Population coverage         4      9                                    5
Predictive value positive   3      5                                    4
Purpose                     4      15                                   10
Portability                 4      2                                    2
Representativeness          3,4    5                                    5
Resources                   3,4    7                                    3
Sensitivity                 3      9                                    5
Simplicity                  3      5                                    4
Stability                   3,4    0                                    0
Timeliness                  3,4    14                                   9
Usefulness                  3,4    8                                    7
Validity                    4      5                                    4


Article Categories:
  • ISDS 2012 Conference Abstracts

Keywords: evaluation, biosurveillance, event-based surveillance.



