PROGRAM EVALUATION

WHAT IS A PROGRAM?

“A program is a single, specific purpose/activity/intervention and is ancillary to the main function of the organization. Typically, the longevity and funding of a program are subject to internal and external factors.”

Pima County Health Department, Tucson, AZ

WHAT PROGRAM EVALUATION SERVICES DO YOU OFFER?

When evaluating a program, we explore your needs in four areas:

1. OVERSIGHT AND COMPLIANCE

Does the funder have required evaluation elements that must be collected and reported? These are usually non-negotiable evaluation data elements: if you don’t provide them to your funder, you will lose your funding. We will work with your program to define these elements and then, to minimize cost to you, develop a plan to collect these data within your current workflow using existing infrastructure. We are well versed in OMB and GPRA requirements. (link to federal reporting requirements publication)

2. PROGRAM IMPROVEMENT

Do you want to ensure you are implementing your program as intended and find ways to streamline your program delivery? We will work with you to develop your program operating procedures (if you don’t have them) and craft targeted questions for your staff and clients to help you make decisions that improve your program delivery without losing quality. We will help you move beyond client satisfaction surveys to gather meaningful data to guide your programmatic decisions. Given the limitations of current program improvement approaches, we have developed innovative methods to address your program improvement needs. (link to RCA article for program improvement and DDB article)

3. IMPACT AND WORTH

Do you want to showcase the difference your program is making in the lives of your clients? We will help you articulate your program’s rationale to your funder and clients, using our proven approaches to understanding the why behind your program (ref ATM articles here). We can develop the rationale and help ensure your program targets the identified underlying issues. This is critical to avoiding activity traps: feel-good activities that have no impact! We will also ensure you are collecting data that capture the changes your program is trying to make. We sum it all up in a logic model for your funder; logic models are a standard requirement of many program funders.

4. KNOWLEDGE DEVELOPMENT

Does your program need research evidence to support its effectiveness? Our staff are all doctoral-level trained and well versed in the research process. We can help you design a reliable and valid study to establish the effectiveness of your program. This is often a necessary criterion for those seeking to develop a best practice/model program they want to share, market, or disseminate more broadly.

PRIOR PROGRAM EVALUATION WORK

Becker, K. L., & Renger, R. (2016). Suggested guidelines for writing reflective case narratives. American Journal of Evaluation, 37(4), 1-13.

Bjerke, M. B., & Renger, R. (2017). Being smart about writing SMART objectives. Evaluation and Program Planning, 61, 125-127.

Coşkun, R., Akande, A., & Renger, R. (2012). Using root cause analysis for evaluating program improvement. Evaluation Journal of Australasia, 12(2), 4-14.

Hurley, C., Renger, R., & Brunk, B. (2005). The applied evaluation experience: Challenges from a student's and instructor's perspective. American Journal of Evaluation, 26(4), 562-578.

Jones, E. G., Renger, R., & Kang, Y. (2007). Self-efficacy for health related behaviors among Deaf adults. Research in Nursing & Health, 30, 185-192.

Jones, E. G., Renger, R., & Firestone, R. (2005). Deaf community analysis for health education priorities. Public Health Nursing, 22(1), 27-35.

Miller, H. B., Sinkala, T., Renger, R., Peacock, E. M., Tabor, J. A., & Burgess, J. L. (2006). Identifying antecedent conditions responsible for the high rate of mining injuries in Zambia. International Journal of Occupational and Environmental Health, 12, 329-339.

Page, M., Parker, S., & Renger, R. (2009). How using a logic model refined our program to ensure success. Health Promotion Practice, 10(1), 76-82.

Renger, R., Bartel, G., & Foltysova, J. (2013). The reciprocal relationship between implementation theory and program theory in assisting decision-making. The Canadian Journal of Program Evaluation, 28(1), 27-41.

Renger, R., Foltysova, J., Becker, K., & Souvannasacd, E. (2015). The power of the context map: Designing realistic outcome evaluation strategies and other unanticipated benefits. Evaluation and Program Planning, 52, 118-125.

Renger, R., & Hurley, C. (2006). From theory to practice: Lessons learned in the application of the ATM approach to developing logic models. Evaluation and Program Planning, 29, 106-119.

Renger, R. (2006). Consequences to federal programs when the logic modeling process is not followed with fidelity. American Journal of Evaluation, 27(4), 452-464.

Renger, R. (2011). Constructing and verifying program theory using source documentation. The Canadian Journal of Program Evaluation, 25(1), 51-67.

Renger, R. The need for a deeper level of accountability – and how we might get there. The National AHEC Bulletin, 19(2), 10-13.

Renger, R. (2013). Unterrichtsevaluation [Evaluating instruction]. Schulmanagement: Die Fachzeitschrift für Schul- und Unterrichtsentwicklung, 1(February), 32-34.

Renger, R., & Bourdeau, B. Strategies for values inquiry: An exploratory case study. American Journal of Evaluation, 25(1), 39-49.

Renger, R., & Titcomb, A. (2002). A three-step approach to teaching logic models. American Journal of Evaluation, 23(4), 493-503.

Renger, R., Cimetta, A., Pettygrove, S., & Rogan, S. (2002). Geographic Information Systems (GIS) as an evaluation tool. American Journal of Evaluation, 23(4), 469-479.

Renger, R., Page, M., & Renger, J. (2007). Illustrating the simplicity of the logic modeling process through the eyes of an eight-year-old. The Canadian Journal of Program Evaluation, 22(1), 195-204.

Renger, R., Passons, O., & Cimetta, A. (2003). Evaluating housing revitalization projects: Critical lessons for all evaluators. American Journal of Evaluation, 24(1), 51-64.

Renger, R., Steinfelt, V., & Lazarus, L. (2002). Assessing the effectiveness of a community based media campaign targeting physical inactivity. Family and Community Health, 25(3), 18-30.

Renger, R. (2014). Contributing factors to the continued blurring of research and evaluation: Strategies for moving forward. The Canadian Journal of Program Evaluation, 29(1), 104-117.

Withy, K. M., Lee, W., & Renger, R. (2007). A practical framework for evaluating a culturally tailored adolescent substance abuse treatment program. Ethnicity and Health, 12(5), 1-14.

© 2019 by JUST Evaluation Services