Empowering Nonprofits Through Data

Funders & Evaluation: A Tension or Opportunity?

This question from Prentice Zinn’s December 2017 AEA365 blog post really piqued my interest.

Zinn discusses common areas of tension such as lack of funding for evaluation and outcomes anxiety. I appreciate this perspective, and believe some tension may stem from the feeling that evaluation is a mandate rather than something grantees want to do. The difference between ‘must’ and ‘want’ is the difference between ‘tension’ and ‘willingness’.

How does the shift happen? Build a culture of evaluation.

Funders who embrace program evaluation as a learning opportunity have the chance to partner with grantees to explore continuous program improvement together.

When outcomes are created collaboratively, data collection tools are agreed upon, and grantees are compensated for the extra time required to engage in evaluation activities, tension can dissipate.

In order for this to happen though, all parties must believe that the reason for program evaluation is to learn how to best meet the needs of the people being served. There must be a mutual willingness to shift away from thinking ‘we have to do evaluation because it’s required’ to a new mindset of ‘we want to do evaluation to learn’.

For more reading about how to begin to build a culture of evaluation right now, check out our free white paper Building a Culture of Evaluation.

Data Visualization + Evaluation Findings = Effective Reporting

Visualizing Data

How I write program evaluation reports has evolved over the years. I was taught to write with text and tables, packed with lots and lots of detail, resulting in lots and lots of pages.

Fast forward a few years.

I still write the traditional technical report, with a little more visualization than in the past. I also write and design an impact report collaboratively with the client. To communicate the findings publicly, we pull out the most important data, discuss how the client will use it, and integrate visualizations throughout. (Shout out to @Evergreendata for teaching me the visual ways.)

I am not a graphic designer. Instead, I create a design concept, placing the content in tables with lots of design notes, such as ‘put a bubble graph here’ and ‘place a photo of a manufactured home park here’. I also share the client’s logo and brand colors to ensure the stakeholder report reflects the organization.

We are currently working with The Meyer Memorial Trust (MMT) to conduct a cross-site evaluation of MMT’s Affordable Housing Initiative (AHI) Manufactured Home Repair Program (MHRP). This year-one impact report summarizes first-year findings, including participation, outcomes, overall impact, and successes and challenges, in eight visual pages.

Download the Report


Upcoming Event: Logic Model Workshop

Join me on Tuesday, February 5, 8:30 – 10am, for an interactive workshop on logic models, hosted by WVDO.


Resources

Some of my favorite resources when it comes to report writing and data visualization:


Innovative Evaluation Reporting

Evaluation Mini-Guide Series: Data Analysis and Reporting


Presenting Data Effectively

– Stephanie Evergreen

Creating Measurable Outcomes

Measurement

Outcome statements are change statements. They are critical to ensure you are collecting data that will inform program improvement efforts. They address the key question:

What do you expect will change as a result of program activities?

The statements provide the foundation from which all data collection questions stem. Too often, organizations jump into creating surveys without outcome statements. The result can be questions that have nothing to do with understanding program impact or measuring progress, like throwing darts at a board and hoping something will stick.

The process of creating measurable outcomes requires time, planning, and collaboration. The statements provide direction, so when you’re ready to collect data, what you ask aligns with the outcome statements. You’ll throw darts and hit a bullseye.

Alignment Example: Program activities & outcomes

What Does the Program Do?
Program Activity: Provide math and science classroom activities for at-risk students

What Change Is Expected as a Result?
Outcome 1: Students will improve their attitude toward math and science
Outcome 2: Students will increase their interest in math and science

At a bare minimum, development and program staff come together to create these statements. Ideally, others are at the table as well, depending upon the size of your organization. Development staff can use outcome statements in their grant proposals and other fundraising activities as applicable, while program staff are typically the ones who collect the data. The result: development staff have the data they need to report to funders, and program staff have the data they need to understand program successes and challenges. Bullseye!

“Outcome statements provided a concise direction on what change we want in our 4th and 5th grade participants. The process of creating measurable outcomes helped our program and development staff members literally get onto the same page as to what impact we wanted to have. This made funder reports easier to generate, and understanding where to hone our program plan clear.”

Want More?

Resource corner: Evaluation Mini Guide Series

This brief mini guide series goes into more detail on several evaluation subjects. The first one highlights how to create logic models and measurable outcomes.

Get the Guide

Collaboration in Action

Collaboration is a recurring theme in my advice to clients. It’s just as important for me as a practitioner to continue to collaborate with fellow evaluators. I recently came across the Evaluation Wrecking Crew in an AEA365 blog post tip of the day. I’ve worked with several Science, Technology, Engineering and Math (STEM) organizations, and was intrigued by the following excerpt:

“We have joined forces to: 1) build a CS/STEM repository of evaluation instruments and approaches; 2) build a common hub for the community, with the assistance of Oak Ridge Associated Universities; and 3) educate the CS community about the value and role of evaluation to improve the quality of CS and STEM education.  We meet biweekly using Zoom video-conferencing software.”

Thanks to the American Evaluation Association (AEA) for creating so many opportunities for evaluators to connect and collaborate. It’s true. On my own, I’m a drop in a bucket. Working alongside other evaluators toward a common goal, we may create an ocean.



Chari accurately captured the fundamental goals and mission of our organization and transformed our input into a clear evaluation process that helps us assess the impact of our programs on the lives of the families that we serve. Now we have an amazing way to measure the physical, emotional, and mental effects of our programs and to guide change, ensuring that we are delivering services in the most effective way possible.

Brandi Tuck, Executive Director, Path Home
