Keywords

agency capacity, community-based organizations, evidence-based interventions, HIV prevention

 

Authors

  1. Collins, Charles PhD
  2. Phields, Miriam E. PhD
  3. Duncan, Ted PhD
  4. the Science Application Team

Abstract

The Centers for Disease Control and Prevention (CDC) implemented the Diffusion of Effective Behavioral Interventions Project to disseminate evidence-based behavioral interventions to community-based HIV prevention providers. Through the development of intervention-specific technical assistance guides and the provision of face-to-face, telephone, and e-mail technical assistance, a range of capacity-building issues was identified. These issues were linked to a proposed agency capacity model for implementing an evidence-based intervention. The model has six domains: organizational environment, governance, and programmatic infrastructure; workforce and professional development; resources and support; motivational forces and readiness; learning from experience; and adjusting to the external environment. We think this model could support the implementation of evidence-based interventions by facilitating the selection of the best-prepared agencies and by identifying critical areas for capacity building. The model will help us establish a framework for informing future program announcements and predecisional site visit assessments and for developing an instrument for assessing agency capacity to implement evidence-based interventions.

 

Article Content

A National Institutes of Health scientific review panel found the research evidence supporting the efficacy of behavioral interventions to be so strong that it recommended that community-based prevention providers use evidence-based interventions with their clients.1 The Institute of Medicine urged the Centers for Disease Control and Prevention (CDC) to translate evidence-based behavioral prevention research findings into community HIV prevention practice.2 In response, the CDC funded programs to implement specific evidence-based strategies.3 The CDC also implemented an initiative to diffuse evidence-based HIV interventions to community-based organizations (CBOs).4 Before the CDC's recent initiatives, there was little evidence that these evidence-based interventions were being implemented by CBOs,5,6 the primary providers of behavioral interventions for HIV prevention in the United States.7 Thus, enhancing agency capacity to provide evidence-based behavioral prevention interventions is an essential component of HIV prevention programs in the United States.

 

Robin Miller8 reviewed the literature on factors that affect the process of technology transfer of evidence-based practices9-12 and examined whether evidence-based interventions are subsequently used in community settings.13,14 She identified four critical factors that affect the technology transfer process: (1) innovation characteristics, (2) relationships and communication between key stakeholders, (3) characteristics of the organizational entities into which technologies are to be disseminated, and (4) settings in which adopters are located. In this article, we explore the third factor, organizational characteristics, as we work toward developing a model of agency capacity for implementing evidence-based interventions, a model by which that capacity might also be assessed.

 

To improve the field of prevention practice, we must ensure that capacity-building strategies and activities are integral parts of efforts to disseminate evidence-based interventions (EBIs). We see this as occurring in two ways. First, agencies with high capacity are funded to implement EBIs, thereby increasing the likelihood that the EBIs are appropriately implemented and that desired outcomes are achieved. Second, as agencies implement EBIs, they engage in capacity-building activities that increase the probability of enhanced HIV prevention performance.

 

For purposes of this discussion, we limit the topic to common domains of agency capacity for implementing evidence-based HIV prevention behavioral interventions. Our model posits that specific capacities to implement evidence-based interventions are a subset of a more general agency capacity.

 

Hawe and colleagues build a case for developing measures that help assess agency capacity.15 We intended our model to be both predictive and prescriptive. A predictive model helps us identify those agencies with high capacity that are best suited to implement evidence-based interventions. A prescriptive model helps us identify areas of improvement within an agency that need to be addressed, thus increasing the likelihood that the evidence-based intervention will be implemented appropriately. However, for the model to be both predictive and prescriptive, an instrument for measuring agency capacity to implement an EBI must be developed. We take the first step in developing such an instrument by identifying the domains of agency capacity needed to implement an EBI.

 

Agency Capacity for Implementing an Evidence-based Intervention

Even though capacity building has received increased attention, there is limited understanding of the actual role that capacity plays in the successful delivery of an evidence-based intervention.16 Goodman and colleagues assert that capacity exists for the purpose of performing a certain action or enabling performance.17 They describe capacity as the ability to carry out clearly articulated objectives.17 LaFond and colleagues find the capacity-building literature to have "limited discussion of how to measure capacity prior to an intervention."16(p4) LaFond and colleagues also conclude that "efforts to measure the outcomes of capacity building are at the very early stages of development."16(p5)

 

The CDC defines capacity building as technical assistance, training, information sharing, technology transfer, materials development, or funding that enables an organization to better serve customers or to operate in a more comprehensive, responsive, and effective manner.18 Although technical assistance and financial resources are critical for improving agency capacity, they alone are generally not sufficient to help agencies manage the growth that occurs when implementing an evidence-based intervention.19,20

 

The elements of agency capacity cited most frequently are the structures, processes, and management systems that enable an organization to function effectively and adapt to changing circumstances.16 Agency capacity typically includes the human, physical, and knowledge resources of the agency and the processes employed to transform these resources into services. Specific individual capacities relevant to agency performance include strategic planning, management of finances, information management, communications, and human resource development and management.16

 

There is no consensus in the research literature on the specific elements of capacity and how these elements might be linked to successfully implementing an EBI. Neither is there consensus on the level of capacity required to successfully implement an EBI. This lack of consensus may reflect the absence of models that articulate the capacity required to implement an evidence-based HIV prevention intervention and the methods by which this agency capacity may be measured.

 

Miller offers one of the first reviews of the agency capacity or conditions necessary to adopt an "externally-developed program" for HIV prevention.8 She examines research that has identified characteristics of organizations, such as agency hierarchy, centralized decision making, stability, and maturity, that would facilitate adoption of an externally developed program.10,11,14,21,22 In her review, Miller finds that some research suggested that agencies that adopt externally developed programs have "complex and well-defined subsystems, are large, and are well financed"8(p641) and tend to be more stable and mature.10,13,14 Miller summarizes that "adopting externally-developed programs requires that an organization have the financial means to do so, as well as the personnel and management infrastructure to implement the program."8(p643)

 

Miller8 points out that the literature on CBOs in the United States indicates that their infrastructures may be fragile and they may be resource-poor.6,23 These findings support the need for CDC to address capacity weaknesses within funded agencies for the purpose of improved service delivery and efficient use of resources.

 

We define agency capacity to implement evidence-based behavioral interventions as the efficient use of human, physical, and knowledge resources and the processes employed to transform these resources into services in order to deliver interventions by implementing programmatic objectives, addressing risk behavior determinants, and achieving risk reduction outcomes.

 

Model Development

Hawe and colleagues combine the capacity-building models of eight research and academic groups into one capacity-building model with three operational levels.15 In the first level of their model, health infrastructure and service development, the focus is on building the capacity to deliver particular interventions to address particular outcomes. Our goal in this article is to show how a similar level 1 model can be developed to ensure that agency capacity is sufficient to deliver an evidence-based intervention.

 

Our agency capacity model was initially derived from an evaluation capacity-building model developed by Milstein and Cotton.24 The Milstein and Cotton model has five domains: organizational, workforce development, resources, motivation, and learning from experience. We added a sixth domain, "adjusting to the external environment," at the suggestion of Deborah McGill (oral communication, July 2004). In addition, in our review of the literature on agency capacity, we found multiple mentions of governance as an essential part of the organizational environment. Our literature review also identified the concept of "agency readiness" as a factor, which we saw as a mediator to action. We thus modified Milstein and Cotton's "motivation" domain to "motivational forces and agency readiness," and we modified their "organizational environment" domain to "organizational environment, governance, and programmatic infrastructure."

 

Wandersman and colleagues have developed a framework for organizational capacities in their "getting to outcomes" approach.25,26 Their domains include human capacities, technical capacities, fiscal capacities, and structural/formal linkage capacities. The "getting to outcomes" approach informed our model, which also draws on diverse disciplines, including theories of organizational development and organizational change, community psychology, public administration, systems theory, and behavioral science17,27-30 and select literature on the operations of community-based organizations that provide HIV prevention services.7,8,23,31,32 A literature review helped us further modify and refine the model.

 

Because of the CDC's emphasis on evidence-based practice, we refined the model to apply to agency capacity for implementing an evidence-based behavioral intervention. We saw agency capacity for implementing an evidence-based intervention as having several domains, all of which might be developed and strengthened through the capacity-building process. Essentially, we viewed capacity building for the agencies as being related to building infrastructure and refining processes to deliver relatively complex behavioral interventions; by using the model, we wished to strengthen service development and delivery.

 

Concurrent with the development of the capacity-building model, 12 interventions were disseminated through the Diffusion of Effective Behavioral Interventions Project. Over 44 months (1 January 2003 to 31 August 2006), these evidence-based interventions were diffused to 1,992 CBOs through 353 intervention trainings; a total of 4,557 CBO employees were trained. Twelve technical assistance guides were developed, one for each behavioral intervention. We were able to use these technical assistance guides, which provided many capacity-building examples, to help modify and inform our model.

 

We initiated a capacity-building assistance (CBA) request and information system (CRIS) Web site through which our grantees could obtain capacity-building information and technical assistance (TA) for implementing their evidence-based interventions. Over an 11-month period, a total of 335 technical assistance requests were entered into the database and divided into the two topics of Strengthening Organizational Infrastructure for HIV Prevention (106 requests) and Strengthening Interventions for HIV Prevention (229 requests). By analyzing the TA requests in the CRIS database, we were able to further inform our model development.

 

Through development of TA guides and provision of face-to-face, telephone, and e-mail TA, a range of capacity-building issues were addressed. Using the collective experience of our CBA and other TA providers and the direct experience of CDC staff providing TA and CBA, we identified a broad range of capacity-building issues that are relevant to the implementation of an EBI. These multiple issues were then grouped into the six categories/domains of our proposed model.

 

A model for agency capacity required to implement evidence-based behavioral interventions was developed with six domains: (1) organizational environment, governance, and programmatic infrastructure (the agency culture and characteristics that are needed to support an EBI); (2) workforce and professional development (knowledge, skills, and abilities of the people who implement an EBI); (3) resources and support (practical assets needed to implement an EBI); (4) motivational forces and readiness (reasons that an EBI will meet the needs of the target population as well as staff enthusiasm to provide high-quality services); (5) learning from experience (changes that occur in people, agencies, and interventions during implementation as provided by feedback and evaluation); and (6) adjusting to the external environment (adjustments to the cultural, social, economic, political, legal, and environmental factors that are within agency control and influence its capacity to implement an EBI). Figure 1 shows the relationships between the domains.
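To make the structure of the model concrete, the sketch below shows one hypothetical way the six domains could be encoded when building the assessment instrument described under Next Steps. The domain names come from the model above; the item wording, the 1-to-5 rating scale, and the averaging of item ratings into domain scores are illustrative assumptions only, not part of any instrument that has been developed.

# Hypothetical sketch (Python): encoding the six capacity domains for a
# draft assessment instrument. Domain names follow the model above; the
# items, rating scale, and scoring scheme are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum
from statistics import mean
from typing import Dict, List


class Domain(Enum):
    ORGANIZATIONAL_ENVIRONMENT = (
        "Organizational environment, governance, and programmatic infrastructure"
    )
    WORKFORCE_DEVELOPMENT = "Workforce and professional development"
    RESOURCES_AND_SUPPORT = "Resources and support"
    MOTIVATION_AND_READINESS = "Motivational forces and readiness"
    LEARNING_FROM_EXPERIENCE = "Learning from experience"
    EXTERNAL_ENVIRONMENT = "Adjusting to the external environment"


@dataclass
class AssessmentItem:
    domain: Domain
    text: str
    rating: int  # hypothetical rating from 1 (low capacity) to 5 (high capacity)


def domain_scores(items: List[AssessmentItem]) -> Dict[Domain, float]:
    """Average the item ratings within each domain (illustrative scoring only)."""
    by_domain: Dict[Domain, List[int]] = {}
    for item in items:
        by_domain.setdefault(item.domain, []).append(item.rating)
    return {domain: mean(ratings) for domain, ratings in by_domain.items()}


# Example usage with two hypothetical items
items = [
    AssessmentItem(Domain.WORKFORCE_DEVELOPMENT,
                   "Staff have been trained on the specific EBI.", 4),
    AssessmentItem(Domain.RESOURCES_AND_SUPPORT,
                   "Budgeting procedures keep EBI services stable.", 3),
]
print(domain_scores(items))

A per-domain profile of this kind would support both intended uses of the model: predicting which agencies are prepared to implement an EBI and prescribing where capacity building should be targeted.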

FIGURE 1. Agency capacity to implement evidence-based HIV prevention practice.

Each of these six domains is defined and briefly described below.

 

Organizational Environment, Governance, and Programmatic Infrastructure

Leadership at multiple levels of the agency29,30 and governance33,34 are primary factors in the organizational environment. Agency infrastructure and adherence to the agency vision and mission are two additional factors within the organizational environment. Primary indicators include management of funds and resources, communication patterns and efficiency, and commitment to the populations/communities served by the agency.

 

Workforce and Professional Development

This domain focuses on both human capital and technical capacities.26 To better ensure full implementation, programs are encouraged to have adequate numbers of personnel devoted to implementing an EBI. Agencies need staff with appropriate credentials and related experience. Not only are more staff usually required to implement an EBI, but these staff must be trained on the specifics of implementing the EBI; they must also maintain this expertise through professional development and supportive supervision. Recruitment of staff who have already implemented an evidence-based intervention is one way that community-based organizations can quickly acquire capacity. An agency with managers or staff who have become familiar with a particular EBI or a similar one has greater workforce capacity to implement the intervention. These employees have experiences that allow them to visualize the operation of the EBI, explain it to fellow employees, and thus serve as mentors or spokespersons for the EBI at the agency. Agency management must also understand the fiscal and developmental resources needed to successfully implement the intervention.

 

Resources and Support

All evidence-based interventions for HIV prevention depend on adequate resources for appropriate planning, implementation, and evaluation. Management of resources includes having budgeting procedures in place that ensure services remain constant.

 

Motivational Forces and Readiness

Enthusiasm for an EBI is required for its implementation. Because the EBI frequently constitutes a new practice, enthusiasm must be generated to overcome the inertia of institutionalized agency practice. The enthusiasm to implement an EBI, however, must reside at several organizational levels, including the board of directors, the executive director of the agency, program directors/coordinators, and frontline staff. A broader picture also includes enthusiasm from agency clients and community readiness.26

 

External motivators for using an EBI include the fact that protocols and other intervention materials have already been developed. Another motivator is that the funding agency requires the implementation of an EBI; by implementing the EBI, the funded agency better positions itself to continue to receive operating funds.

 

Learning From Experience

The learning from experience component of our model was informed by models of organizational learning and participatory organizational development.35-37 Organizational development models36 and learning organization models35,37 emphasize self-assessment and reflection approaches.

 

In a review of the "learning organization" literature, Hawe and colleagues15 find that learning organizations have the following characteristics: (1) an openness to new ideas; (2) an agency culture that encourages and provides opportunities for learning and innovation; and (3) widespread knowledge of the organization's goals and mission and understanding of how each person and each part of the organization contributes.38 These organizations have "visible leadership" that promotes a broad range of skills among employees; with these broad skills, employees can not only function in their own jobs but also contribute to the entire organization.39

 

Keys developed a model whereby agency development is a continually active process during which many types of data and information are analyzed to determine agency and clientele needs, to modify goals and objectives as new information becomes available, and to make judgments about programmatic outcomes.36 Miller and colleagues point out that these models are used to implement agency self-assessment as an ongoing, continuous process during which agency operations and functioning are improved through systematic and planned self-study.40

 

In our model, learning from experience acts as a mediator between the other four domains and agency capacity. For example, we see a distinction between resources and wise use of those resources. Through the learning from experience process, an agency learns how to best use resources to achieve agency goals and outcomes.

 

Adjusting to the External Environment

Agencies require resources, primarily funds, to continue providing HIV prevention services to their target populations. Thus, a critical aspect of agency capacity to implement evidence-based interventions is the ability to identify the funding priorities of various potential funding organizations and to successfully apply for these funds when the grant objectives correspond to the agency's mission.

 

Managing the external environment means more than maintaining a funding source for agency activities. It also means managing environmental and contextual factors such as the cultural, social, economic, political, legal, and structural/environmental variables that affect capacity and performance. An agency's ability to respond to new information and adjust its programs accordingly is essential to maintaining stable service delivery.13,41-45

 

Competition between service delivery agencies sometimes develops in communities. Those agencies that can move toward collaboration strategies are more likely to manage their external environment well and increase their likelihood of future funding. Sharing resources and developing collaborative relationships may be critical to successful implementation of an EBI. Wandersman and colleagues identify four levels of linkage: networking for the exchange of information of mutual benefit; coordinating and adjusting activities and service delivery for mutual benefit; cooperating by sharing resources; and collaborating with formal sustained commitments.26

 

Next Steps

Four steps follow from the development of this model: (1) further refine the model to include specific capacity measures for specific EBIs; (2) use the indicators in future CDC program announcements and in the predecisional site visit process to identify and fund those agencies with the highest capacity to implement EBIs; (3) begin to link agency capacity to intervention performance standards as a measure of whether capacity has actually been achieved and whether this capacity is related to intervention outcomes; and (4) begin to provide appropriate capacity-building strategies to enhance EBI delivery. These next steps are discussed in more detail below.

 

1. To effectively implement capacity-building strategies, funding agencies must be able to accurately assess existing capacity and measure the effectiveness of capacity-building interventions.16 Our next step is to develop and field test instruments for assessing the elements and indicators in the model and then apply the model in the prevention field; the results of these applications are expected to provide information to refine the model further. We will use these results to develop more specific models to indicate the capacity required to implement specific evidence-based interventions.

 

We designed our model to make more explicit the assumptions underlying the elements of capacity and performance and to provide a framework for testing those assumptions. We viewed this as a necessary step toward increased understanding of the indicators of agency capacity needed to implement evidence-based interventions. We also viewed this as an opportunity to eventually develop an empirical instrument that might be used both to predict which agencies will perform well at implementing an evidence-based intervention and to identify the areas that need improvement for an agency to perform at higher levels.

 

2. Eventually, the model will also be used in developing CDC program announcements to better identify high-capacity prevention providers during the objective review and funding process. In addition, the model offers a template for the development of an interview-driven assessment that project officers could use during predecisional site visits to assess whether potential grantees have the agency capacity to implement an evidence-based intervention.

 

3. In the capacity-building literature, LaFond and colleagues find many examples of how to improve agency capacity but little discussion of the level of agency performance expected from these improvements in capacity.16 Our model should link agency capacity to the ability to implement an evidence-based intervention with fidelity. This linkage responds to LaFond and colleagues' assertion that criteria must be established to measure whether actual capacity building has occurred at the agency level. We have selected fidelity to the core elements of the evidence-based behavioral intervention as the first criterion for judging whether capacity has indeed been increased enough to achieve programmatic outcomes. In addition, levels of service delivery using the evidence-based intervention would also be tied to agency capacity.

 

4. Kotellos and colleagues suggest that scarce resources require agencies such as the CDC to pay closer attention to how the capacities of community-based organizations are built to ensure that CBOs make the best use of these resources in implementing evidence-based behavioral interventions.46 The model will eventually help CBA providers in their work building the capacity of prevention agencies. The model allows capacity-building efforts to be focused on prevention providers and identifies content areas for resource development at the city, county, state, and federal levels.

 

Conclusions

We present a conceptual model that might be used to measure agency capacity to implement an evidence-based intervention for HIV prevention. Our model provides insight into the critical gaps in agency capacity at the preimplementation phase of a project and a strategy for addressing deficits in agency capacity. Also, our model can be used to link these capacity elements to performance, which may help in monitoring the successful implementation of evidence-based interventions. Practical application of the model will help advance the measurement of agency capacity to implement an evidence-based intervention and begin to clarify the relationship between capacity and performance.

 

The Science Application Team at CDC is composed of Jonny F. Andia, PhD; Charles Collins, PhD; Ted Duncan, PhD; Taleria R. Fuller, PhD; Camilla Harshbarger, PhD; Winifred King, PhD; John Mosier, PhD; Miriam E. Phields, PhD; Cynthia Prather, PhD; Tanya Telfair Sharpe, PhD; JoAna Stallworth, PhD; and David Whittier, PhD.

 

Jonny F. Andia, PhD, is Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

Tanya Telfair Sharpe, PhD, is Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

Taleria R. Fuller, PhD, is Fellow, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

Camilla Harshbarger, PhD, is Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

Winifred King, PhD, is Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

John Mosier, PhD, is Research Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

Cynthia Prather, PhD, is Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

JoAna Stallworth, PhD, is Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

David Whittier, PhD, is Research Behavioral Scientist, Capacity Building Branch, Centers for Disease Control and Prevention, Atlanta, Georgia.

 

REFERENCES

 

1. National Institutes of Health. Interventions to prevent HIV risk behaviors. NIH Consens Statement. 1997;15(2):1-41.

2. Institute of Medicine. No Time to Lose: Getting More From HIV Prevention. Washington, DC: National Academy Press; 2001.

3. Department of Health and Human Services. Program announcement 04064: human immunodeficiency virus (HIV) prevention projects for community-based organizations (CBOs). Fed Regist. 2003;68(231):67566-67575.

4. Collins C, Harshbarger C, Sawyer R, Hamdallah M. The diffusion of effective behavioral interventions project: development, implementation, and lessons learned. AIDS Educ Prev. 2006;18(suppl A):5-20.

5. Somlai AM, Kelly JA, Otto-Salaj LL, et al. Current HIV prevention activities for women and gay men among 77 ASOs. J Public Health Manag Pract. 1998;5:23-33.

6. DiFranceisco W, Kelly JA, Otto-Salaj L, McAuliffe TL, Somlai AM, Hackl K. Factors influencing attitudes within AIDS organizations toward the use of research-based HIV prevention interventions. AIDS Educ Prev. 1999;11:72-86.

7. Altman DG. Power and Community: Organizational and Cultural Responses to AIDS. Bristol, PA: Taylor and Francis; 1994.

8. Miller RL. Innovation in HIV prevention: organizational and intervention characteristics affecting program adoption. Am J Community Psychol. 2001;29(4):621-647.

9. Backer TE, David SL, Soucy G, eds. Reviewing the Behavioral Science Knowledge Base on Technology Transfer. Rockville, MD: National Institute on Drug Abuse; 1995.

10. Baldridge JV, Burnham RA. Organizational innovation: individual, organizational, and environmental impacts. Adm Sci Q. 1975;20:165-176.

11. Rogers EM. Diffusion of Innovations. 4th ed. New York: Free Press; 1995.

12. Mayer JP, Davidson WS. The dissemination of innovation. In: Rappaport J, Seidman E, eds. The Handbook of Community Psychology. New York: Plenum Press; 2000:421-438.

13. Shediac-Rizkallah MC, Bone LR. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998;13:87-108.

14. Steckler A, Goodman RM. How to institutionalize health promotion programs. Am J Health Promot. 1989;3:34-44.

15. Hawe P, Noort M, King L, Jordens C. Multiplying health gains: the critical role of capacity-building within health promotion programs. Health Policy. 1997;39:29-42.

16. LaFond AK, Brown L, Macintyre K. Mapping capacity in the health sector: a conceptual framework. Int J Health Plann Manag. 2002;17:3-22.

17. Goodman RM, Speers MA, McLeroy K, et al. Identifying and defining the dimensions of community capacity to provide a basis for measurement. Health Educ Behav. 1998;25:258-278.

18. Centers for Disease Control and Prevention. Building capacity: technology transfer efforts and sustainability for HIV prevention. National Center for HIV, STD, and TB Prevention Program Briefing; 2000.

19. Campbell P. Strengthening organizations. NGO Manag. 1990;18:21-24.

20. Marsden D, Oakley P, Pratt B, eds. Measuring the Process: Guidelines for Evaluating Social Development. Oxford, UK: International NGO Training and Research Center; 1994.

21. Aiken M, Hage J. The organic organization and innovation. Sociology. 1971;5:563-582.

22. Hage J, Aiken M. Social Change in Complex Organizations. New York: Random House; 1970.

23. Freudenberg N, Zimmerman MA. AIDS Prevention in the Community: Lessons from the First Decade. Washington, DC: American Public Health Association; 1995.

24. Milstein B, Cotton D. Defining concepts for the presidential strand on building evaluation capacity. Paper presented at the 2000 meeting of the American Evaluation Association: Evaluation 2000; November 2, 2000; Honolulu, HI.

25. Wandersman A, Imm P, Chinman M, Kaftarian S. Getting to Outcomes: Methods and Tools for Planning, Self-Evaluation, and Accountability. Rockville, MD: Center for Substance Abuse Prevention; 1999.

26. Wandersman A, Imm P, Chinman M, Kaftarian S. Getting to outcomes: a results-based approach to accountability. Eval Program Plann. 2000;23:389-395.

27. Beckhard R, Harris R. Organizational Transitions: Managing Complex Change. Reading, MA: Addison-Wesley Publishing Co; 1987.

28. Cameron KS, Whetten DA, eds. Organizational Effectiveness: A Comparison of Multiple Models. New York: Academic Press; 1983.

29. Labonte R, Laverack G. Capacity building in health promotion, I: For whom? And for what purpose? Crit Public Health. 2001;11:111-127.

30. Fredericksen P, London R. Disconnect in the hollow state: the pivotal role of organizational capacity in community-based development organizations. Public Adm Rev. 2000;60:230-239.

31. Miller RL. Assisting gay men to maintain safer sex: an evaluation of an AIDS service organization's safer sex maintenance program. AIDS Educ Prev. 1995;7(suppl 5):48-63.

32. Barton-Villagrana H, Bedney BJ, Miller RL. The function of peer relationships among HIV prevention providers. J Prim Prev. 2002;23:217-236.

33. Van Wart M. The first step in the reinvention process: assessment. Public Adm Rev. 1995;55:429-438.

34. Scheirer MA. A template for assessing the organizational base for program implementation. New Dir Eval. 1996;72:61-80.

35. Senge PM. The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Doubleday/Currency; 1990.

36. Keys CB. Organizational development: an approach to mental health consultation. In: Mannino FV, Trickett EJ, Shore MF, Kidder MG, Levine G, eds. Handbook of Mental Health Consultation. Rockville, MD: National Institute of Mental Health; 1986.

37. Preskill H, Torres RT. Evaluative Inquiry for Learning in Organizations. Thousand Oaks, CA: Sage; 1999.

38. Jelinek M. Institutionalizing Innovation: A Study of Organizational Learning Systems. New York: Praeger; 1979.

39. Argyris C. On Organizational Learning. Oxford, UK: Blackwell; 1992.

40. Miller RL, Bedney BJ, Guenther-Grey C, for the CITY Project Study Team. Assessing organizational capacity to deliver HIV prevention services collaboratively: tales from the field. Health Educ Behav. 2003;30(5):582-600.

41. Altman DG. Sustaining interventions in community systems: on the relationship between researchers and communities. Health Psychol. 1995;14:526-536.

42. Bracht N, Finnegan JR, Rissel C, et al. Community ownership and program continuation following a health demonstration project. Health Educ Res. 1994;9:243-255.

43. Devieux JG, Malow RM, Rosenberg R, et al. Cultural adaptation in translational research: field experiences. J Urban Health. 2005;82(2, suppl 3):iii82-iii91.

44. Keller LO, Schaffer MA, Lia-Hoagberg B, Strohschein S. Assessment, program planning, and evaluation in population-based public health practice. J Public Health Manag Pract. 2002;8(5):30-43.

45. Trickett EJ. Context, culture, and collaboration in AIDS interventions: ecological ideas for enhancing community impact. J Prim Prev. 2002;23(2):157-174.

46. Kotellos KA, Amon JJ, Githens Benazerga WM. Field experiences: measuring capacity building efforts in HIV/AIDS prevention programs. AIDS. 1998;12(suppl 2):S109-S117.