
Program Evaluation: Tips for Design, Implementation, and Evaluation (Part 2 of 2)

by Tim Walter, Arapahoe County Early Childhood Council

This blog entry is the second of two posts written by Tim Walter on Data and Evaluation. You can read the first post here.

The ultimate purpose of evaluation is to show, through imagery and visuals, the “good work” our agencies perform. Big Data is only as effective as our ability to summarize it and communicate it to the public through attractive, innovative visuals. The average person takes only a few seconds to process information and draw conclusions from it; as grant funds become more competitive, program evaluation within an agency must move toward effective messaging.

Program Evaluation is several jobs rolled into one.  It is becoming more necessary for evaluators to possess multiple skills: research, grant writing, statistics, computer database design, strategic planning, data analysis, and graphic design (among others).

My hope is to provide a few tips to jump start your evaluation efforts.  Through discussions with non-profit councils, I’ve realized that not every agency is equipped with an Evaluator or Data Manager; however, through collaboration we can begin to improve child and family evaluation in Colorado.

Program evaluation consists of three components: (1) Design, (2) Implementation, and (3) Evaluation.

The term “evaluation” is often used as shorthand for all three components, yet they are distinct: we must be able to develop each one independently and flexibly while remaining mindful of how they will ultimately interact.

Design:

The design phase gives us the opportunity to identify “what to track” in order to measure a program’s effectiveness. Generally, we can find what we need to measure (i.e., the “primary asks”) within our program’s grant. If a grant is well written, there may be a section titled “Measurables” or “Outcomes Desired”; these are great starting points for developing program evaluation. Additionally, we may have “secondary asks,” which generally come from stakeholders, board members, or collaborative groups our agencies partner with. I consider these of secondary importance because we are not legally obligated to report on them (they are not “primary asks”). We must therefore develop tracking systems and remain accountable to our program grants, the “primary asks” on which we are legally required to report each year. Consider creating a “hierarchy of asks”; it then becomes easier to see which category a particular ask falls into (i.e., grant, stakeholder, board member, or collaborative ask). Ultimately, it may well be possible to tuck “secondary asks” into the existing “primary asks” identified within your program grants.
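To make the “hierarchy of asks” concrete, the short Python sketch below tags each ask with its source and separates the grant-required (primary) asks from the rest. The asks and field names are hypothetical placeholders, not drawn from any particular grant.

# A minimal, hypothetical "hierarchy of asks": each ask is tagged with its
# source so grant (primary) asks can be separated from secondary ones.
asks = [
    {"ask": "Report total participants served", "source": "grant"},
    {"ask": "Track referrals by partner agency", "source": "collaborative"},
    {"ask": "Summarize family satisfaction", "source": "board"},
]

primary = [a for a in asks if a["source"] == "grant"]
secondary = [a for a in asks if a["source"] != "grant"]

print("Primary asks (required by the grant):", [a["ask"] for a in primary])
print("Secondary asks (stakeholder/board/collaborative):", [a["ask"] for a in secondary])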

As part of the design phase, also consider using a Logic Model (a visual representation of the program), because it will allow you to identify the following:

(1) “Goals” of the program

(2) “Short” and “Long” term goals

(3a) “Activities” you or your staff will be required to perform

(3b) “Indicators,” which tell you how the “Activities” are progressing. Generally these are total counts (e.g., “the total number of individuals served in a program” or “the total number of sites that increased their site rating levels from 2016 to 2017”)

(4) “Inputs,” which are all the materials/staff that make a program function.

Logic Models should be updated yearly and should be flexible enough to allow you to tuck in any new “primary” or “secondary” asks.
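One way to keep a Logic Model easy to update is to capture it as structured data alongside the visual version. The Python sketch below is a hypothetical example; the goals, activities, indicators, and inputs shown are placeholders rather than a real program’s model.

# A hypothetical Logic Model captured as a plain dictionary so that new
# "primary" or "secondary" asks can be tucked in during the yearly update.
logic_model = {
    "short_term_goals": ["Increase caregiver knowledge of early literacy"],
    "long_term_goals": ["Improve kindergarten readiness"],
    "activities": ["Deliver monthly caregiver workshops"],
    "indicators": ["Total number of individuals served in the program"],
    "inputs": ["Program staff", "Workshop materials", "Meeting space"],
}

# Yearly update: tuck a new secondary ask in as an additional indicator.
logic_model["indicators"].append("Total number of partner referrals received")

for section, items in logic_model.items():
    print(section + ":", "; ".join(items))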

Implementation:

Now that we’ve identified “what to track,” we need to consider how to track it. The implementation phase should focus on establishing tools to track our program outcomes (activities and indicators) for grant funders, and on creating protocols staff can follow so that activities are tracked accurately and consistently for reporting. Using our Logic Models, we can visually outline and match each program “activity” to an “indicator.” Activities are generally outlined in our program grants (e.g., “report the total number of participants served in 2017”) and should be accompanied by an indicator (e.g., “the total number of participants served in 2017”). This makes it easy to identify our program totals for reporting. Remember to create tracking protocols for staff. Program grants often outline tracking protocols (e.g., “at participant enrollment, complete a pre-survey assessment with the caregiver”); however, it may be necessary to create additional protocols for data entry. Inaccurate or inconsistent data entry undermines data integrity; by taking the time to implement staff data entry protocols, we can prevent future data errors and reduce cleanup work.
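A data entry protocol can also be backed by a small automated check. The Python sketch below uses hypothetical field names to flag enrollment records that are missing the pre-survey assessment a grant might require.

# Hypothetical data-entry check: flag enrollment records that are missing
# the pre-survey assessment required by the tracking protocol.
records = [
    {"participant_id": 101, "enrolled": "2017-01-12", "pre_survey_done": True},
    {"participant_id": 102, "enrolled": "2017-02-03", "pre_survey_done": False},
]

missing = [r["participant_id"] for r in records if not r["pre_survey_done"]]
if missing:
    print("Missing pre-survey at enrollment for participants:", missing)
else:
    print("All enrollment records follow the data entry protocol.")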

Generally, Microsoft Excel or Access is sufficient for tracking program activities and allows us to quickly calculate the total numbers (“indicators”). Avoid MS Word where possible, as it does not total numbers easily and forces additional time to be spent on manual counting. If our agency programs already use an existing database for tracking, we should always attempt to pull number totals from it.
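If the tracking sheet lives in Excel, a few lines of Python with the pandas library can pull an indicator total directly from the workbook instead of counting by hand. The file name and column names below are hypothetical assumptions about how such a sheet might be laid out.

import pandas as pd

# Hypothetical tracking workbook: one row per participant contact,
# with "participant_id" and "year" columns.
df = pd.read_excel("program_tracking_2017.xlsx")

# Indicator: total number of unique participants served in 2017.
served_2017 = df.loc[df["year"] == 2017, "participant_id"].nunique()
print("Total participants served in 2017:", served_2017)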

Evaluation:

Lastly, if we have successfully identified our various “asks” and set up efficient systems to track them, we can develop our program messaging and communication. As grants become more competitive, we must be able to “show the good work we do” in an innovative and concise manner. Consider producing maps that show geographically whom and where we are serving, or infographics (e.g., graphs, bar charts, pie charts, and tables) that summarize and display our program’s impact. Free and low-cost mapping and infographic options, such as ArcGIS, QGIS, Piktochart, and PowerPoint/Excel, exist for agencies working with a limited budget.
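Even without dedicated design software, a short Python script using matplotlib can turn indicator totals into a shareable chart. The numbers below are made up purely for illustration.

import matplotlib.pyplot as plt

# Hypothetical indicator totals, for illustration only.
years = ["2015", "2016", "2017"]
participants_served = [120, 150, 185]

plt.bar(years, participants_served)
plt.title("Participants Served by Year")
plt.ylabel("Total participants served")
plt.tight_layout()
plt.savefig("participants_served.png")  # image can be dropped into reports or posts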

At the Arapahoe County Early Childhood Council (ACECC), we owe much of our initial evaluation and data development efforts to the ECCP Mini-Grant (2016).  As we move forward towards program messaging and communication (a focus for 2017-2018), we are excited to collaborate with the ECCP and other councils to further develop evaluation efforts and innovative messaging techniques in an effort to improve child and family outcomes in Colorado.

If you would like to discuss program evaluation and data reporting, please feel free to contact Tim Walter, Program Evaluator and Data Analyst at ACECC, at tim@acecc.org. We look forward to future collaborations!
