Take Home Lessons from #EVAL13


The American Evaluation Association national conference in Washington, D.C. just wrapped up last week. It is the largest annual evaluation-focused event, drawing about 3,000 evaluators this year. It was an enlightening experience for me and left me wondering why I don’t go every year.

Evaluators are friendly and obsessed with improvement: Almost everyone I met was friendly and helpful. There was a focus on helping each other do our work better: data visualizations, reports, presentations, and even blogs. That makes sense, since this is arguably our main task as evaluators: How can we help make programs, projects, and products better?

I attended the Data Visualization TIG meeting which was filled with comedy and insight. Many of the affiliated evaluators blog, which encouraged me to move past obsessing over picking my WordPress theme and to start posting.

It is refreshing to attend a conference and be immersed in new ideas, and to be reminded of old ideas that need to be revisited and reworked. I also realized that I can get my regular dose of evaluation thinking through ongoing contact with other evaluators via their Twitter feeds, blogs, and other posts.

Strive to increase the utility of evaluation reports: It is a rare soul who reads an entire evaluation report after I submit it. So how can I put the information into a more digestible format? How can I provide feedback that is more concise and takes less time to understand?

One presenter suggested an executive-summary-style report, with all or most of the results moved to an appendix. To top it off, she develops a one-pager with visuals that summarize the main points. I think that approach makes sense most of the time, though it will largely depend on the stakeholders and their backgrounds and roles on the project. I am going to try it on a report I just developed.

In an effort to create a visual product that is more accessible to stakeholders, we developed what we called a Visual Evaluation Report. Essentially, it is a video summary of a domestic violence and high-conflict divorce program, Bridging Families and Communities, along with a review of the evaluation findings. Check out the YouTube video here; I would love to hear some feedback.

Improve the visual display of quantitative results: There were tons of examples of “good” versus “bad” graphs in the conference talks. Most of the time our products are shared not with researchers and evaluators but with stakeholders who have limited quantitative experience. The design of a graph is key to its interpretability: the colors chosen, the layout, the labeling… everything.

The fact is, while the APA style guide and accompanying resources might have suggestions in this respect, we should strive to continually make our visualizations better. For instance, I liked the idea of examining palettes of colors that work well together and considering how color-blind individuals would visually process colored figures. So the design of my bar charts will not be a stagnant template but instead will be constantly improving.

It is also important to consider how evaluation reports that use color figures will look when printed in black and white. We have all seen it, and the result can be a disaster. Sometimes it is worth printing high-quality color reports and sending them to stakeholders directly; many people still prefer to print out reports.
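One rough way to anticipate the black-and-white problem is to compare the relative luminance of a palette's colors: hues that differ in color but not in lightness collapse together in grayscale. Here is a minimal Python sketch; the hex codes and the 0.15 gap threshold are illustrative choices of mine, not from any conference talk or style guide:

```python
def relative_luminance(hex_color):
    """Approximate relative luminance (0 = black, 1 = white) of an sRGB hex color."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearize(c):
        # sRGB transfer function: undo gamma before weighting channels
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def grayscale_distinct(palette, min_gap=0.15):
    """True if every color differs enough in luminance to survive B&W printing."""
    lums = sorted(relative_luminance(c) for c in palette)
    return all(b - a >= min_gap for a, b in zip(lums, lums[1:]))

# Three hues that look distinct in color but are similar in lightness
hues = ["#1b9e77", "#d95f02", "#7570b3"]
# Light, medium, and dark grays with clearly separated lightness
grays = ["#f0f0f0", "#969696", "#252525"]

print(grayscale_distinct(hues))   # False: nearly identical in grayscale
print(grayscale_distinct(grays))  # True: still distinguishable in B&W
```

A check like this is no substitute for printing a test page, but it catches the worst offenders before a report goes out.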

Cost-effectiveness is a perennial evaluation issue: I took a three-hour cost-effectiveness workshop with Edward Broughton, Director of Research and Evaluation for the USAID-ASSIST and USAID Health Care Improvement Projects. We examined methods for determining the true costs of initiatives. Funders typically want to know whether the programs they fund make an impact in a more cost-effective way than other approaches.
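The core arithmetic behind that kind of comparison can be sketched in a few lines. All figures below are invented for illustration; a real analysis would also involve careful costing of inputs and sensitivity checks:

```python
def cost_effectiveness_ratio(total_cost, units_of_outcome):
    """Average cost per unit of outcome (e.g., dollars per participant who improves)."""
    return total_cost / units_of_outcome

# Invented figures for two hypothetical programs pursuing the same outcome
cer_a = cost_effectiveness_ratio(total_cost=120_000, units_of_outcome=300)  # 400.0
cer_b = cost_effectiveness_ratio(total_cost=90_000, units_of_outcome=180)   # 500.0

# Incremental ratio: extra cost per additional unit of outcome from scaling up to A
icer = (120_000 - 90_000) / (300 - 180)  # 250.0

print(cer_a, cer_b, icer)
```

Even this toy example shows why funders ask the question: the larger program costs more in total but delivers each outcome more cheaply.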

There are innovative approaches to assessing the relative value of programs that strive toward “social equality, environmental sustainability and wellbeing,” such as the methods discussed by groups like the Social Return on Investment Network (http://www.thesroinetwork.org). I am excited to incorporate these methods into my practice.
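The headline SROI metric is a ratio of the present value of monetized social outcomes to the investment made. A minimal sketch of that calculation follows, with invented cash flows and a hypothetical 3.5% discount rate; a full SROI analysis also adjusts for factors like deadweight and attribution, which are omitted here:

```python
def present_value(benefit_flows, discount_rate):
    """Discount a list of yearly benefit values (year 1, 2, ...) back to today."""
    return sum(v / (1 + discount_rate) ** t for t, v in enumerate(benefit_flows, start=1))

def sroi_ratio(benefit_flows, investment, discount_rate=0.035):
    """Present value of monetized social outcomes per dollar invested."""
    return present_value(benefit_flows, discount_rate) / investment

# Invented figures: $50k invested, yielding $20k/year of monetized outcomes for 3 years
ratio = sroi_ratio([20_000, 20_000, 20_000], investment=50_000)
print(round(ratio, 2))  # about 1.12, i.e., roughly $1.12 of value per $1 invested
```

The discounting step matters: without it the same figures would suggest $1.20 of value per dollar, overstating benefits that arrive years in the future.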

For me, there are many reasons to travel to Denver: visiting friends from high school, college, and beyond who have ended up in that region, hiking at Chautauqua in Boulder, and walking around downtown Denver, to name a few. Next October, #EVAL14 will be added to my list.



James Pann, Ph.D. is a Professor at Nova Southeastern University and a psychologist and evaluator with nearly 25 years of experience. He conducts research and evaluation projects with non-profit organizations in the fields of health, human services, and education, and has received funding from multiple government agencies.

James holds multiple degrees including a Ph.D. in Counseling Psychology, an M.S. Ed. from the University of Miami, and a BBA in Accounting from the University of Texas at Austin. He is also the host of the EvalNetwork podcast, a frequent conference presenter, and has published several peer-reviewed research articles and co-authored a book. James currently resides in Miami, Florida with his family and enjoys backpacking trips. Find out more about his work here.

