Storytelling Through Data – A Case Study in Evaluation

Updated: Aug 16, 2021

“Stories give life to data, and data gives authority to stories.”

~ Wendy Newman


By our very nature, data analysts and program evaluators are storytellers. We design databases and survey tools, collect data, and clean data, but at the end of the day, we want the data to tell the story of what we’ve been studying – to answer questions, demonstrate value, and generate conversation among stakeholders.


Program evaluation serves a variety of purposes – not just demonstrating whether a program was effective[1]. Beyond this, there are numerous reasons why evaluation is a meaningful component of any project, including:

  • Identifying ways to improve your program along the way and make adjustments in real time

  • Comparing project outcomes over time

  • Justifying project funding for current and future work

  • Helping an organization develop budgets, facilitate long-term planning, and identify training and technical assistance needs

  • Promoting the program to increase participation


Utilizing Mixed-Methods to Tell the Tale


As researchers, we are familiar with the value of employing mixed-methods techniques when designing a research project, and this holds true for evaluation designs as well. A mixed-methods approach – employing both quantitative and qualitative data collection methods – offers several advantages to the researcher and provides a more holistic understanding of a project or problem than either method alone. In program evaluation projects, quantitative data is usually a given – basic quantitative data such as participant counts, retention numbers, and completion rates provides insight into the reach of the program and is typically required by a funder as part of standardized reporting.


Qualitative data in evaluation gives the numbers depth and meaning – what kind of impact the program components have on participants, how their increased knowledge from an intervention changed their behavior, why they continued to participate in a program. The use of mixed methods can offset the potential weaknesses of relying on quantitative or qualitative data alone. For more insight into the benefits of using mixed methods in a project, check out the links at the bottom of this post.


A Case Study in Evaluative Storytelling


Recently, our team was fortunate to work as evaluators on a large project aiming to improve the social and emotional wellbeing of educators during COVID-19. Due to the volatile, ever-shifting nature of the pandemic response, the project team – and by extension, the evaluation team – became adept at pivoting in order to increase the reach and impact of the various program components and make sure the available resources were delivered to the educators who needed them.


As the project progressed, it became clear through testimonials, written feedback, and interviews with participants that the program was of significant value to participants. However, the quantitative data alone did not capture the depth of this impact, so our team looked for additional ways to tell the story of the project. When developing the final report for the project, we began by asking what and for whom:

  • What is the purpose of the report? Or more specifically, which of the purposes described above will this report highlight?

  • Who will be the audience of the evaluation report?

The what aimed to demonstrate the impact of the program on participants, so we chose to tell the project’s story through the voice of the participants via a short testimonial video in addition to an interactive, flip-book style report. By having two compelling visual materials to share, the organization will be able to demonstrate the impact of the project quickly, professionally, and succinctly to the who – current program funder, future program partners, stakeholders, and potential funders.


Sample Evaluation Project Timeline


The video was designed to introduce stakeholders to the project team and hear in their own words the objectives and outcomes of the project. Additionally, video testimonials from program participants demonstrated the impact of the project in a way that reports, numbers, and charts cannot – by hearing the story directly from those most impacted by it.


“Reports convey information. Stories create experience. Reports transfer knowledge. Stories transport the reader, crossing boundaries of time, space, and imagination. The report points us there. The story puts us there.”

~ Roy Peter Clark


The report focused on blending the quantitative data with meaningful qualitative data, such as quotes from participants or project staff members, and packaging it in a succinct and memorable way with iconography and visuals. Additionally, to give the reader perspective and insight into the environment under which the program took place (in this instance, the COVID-19 pandemic) our team developed a timeline to demonstrate the resilience and adaptability of the project team throughout the course of the project.



Telling Your Client’s Story


Utilizing a mixed-methods evaluation approach, and being poised to pivot at a moment’s notice throughout the life of this project, allowed us to capture the true, meaningful impact on the program participants. It also provided our client with rich data and a strong visual story to share with their partners, funders, and future stakeholders as they move forward with their vital work.


“Human beings share stories to remind each other of who they are and how they should act.”

~ Jonah Sachs


Resources:

Introduction to Mixed Methods in Impact Evaluation

CDC Coffee Chat: Using Mixed Methods in Program Evaluation

Introduction to Program Evaluation for Public Health Programs





[1] Introduction to Program Evaluation for Public Health Programs, CDC.gov