
Ensuring the Colorado Shared Message Bank and a Community for Communicators Grows and Thrives

As many in the early childhood community know, the Early Childhood Colorado Partnership (ECCP) worked intentionally for many years to advance the shared goals within the Early Childhood Colorado Framework. Developing and sustaining a comprehensive early childhood system with children and families at the center was no easy task, and no individual, organization or agency could do it alone. Together, their collective efforts produced numerous engagements and tools that continue to support families and organizations today. However, with numerous organizations now focused on early childhood health and education, the ECCP Steering Committee decided earlier this year to discontinue the work and create space for others to grow and thrive.

As the ECCP dissolved, Illuminate Colorado, a statewide 501(c)(3) organization dedicated to strengthening families, organizations and communities to prevent child maltreatment, stepped in to ensure that the Colorado Shared Message Bank continues to exist and grow into the future. Illuminate will also sustain the community ECCP supported for communicators, which promotes research-based frames and messages proven to garner broad public support for issues connected to healthy early childhood development, the prevention of adversity, the mitigation of toxic stress, and the promotion of resilience and strengthening families.

The Colorado Shared Message Bank is a critical tool in the toolbox to aid people in shifting community norms, perceptions and policy toward embracing the importance of investing in children and strengthening families and communities. It is also the product of a collaborative effort by many people trained to support partners around the state as they work to integrate the Shared Message Bank and communications best practices into their communications efforts with and on behalf of children and families. These mentors review partner materials and offer advice on framing to promote social change or engage Coloradans in services that strengthen families.

As additional frameworks to support maternal health and child maltreatment prevention have been created in Colorado, and as the community focused on improving communications practices begins to reconvene to share what is working and learn from one another, Colorado offers a message of gratitude to the Early Childhood Colorado Partnership. The ECCP provided the space and conditions for diverse partners across the comprehensive early childhood system – encompassing health, mental health, family support and early learning – to come together to identify common results, share best and innovative practices, and implement strategies to improve system effectiveness for and with child and family well-being.

Join the community of communicators to find support in promoting messages that strengthen families. 

Program Evaluation: Tips for Design, Implementation, and Evaluation (Part 2 of 2)

by Tim Walter, Arapahoe County Early Childhood Council

This blog entry is the second of two posts written by Tim Walter on Data and Evaluation. You can read the first post here.

The ultimate purpose of evaluation is to show, through imagery and visuals, the “good work” our agencies perform. Big Data is only as effective as our ability to summarize it and message it to the public via attractive and innovative visuals. The average person takes only a few seconds to process and draw conclusions about information, so program evaluation within an agency must move toward effective messaging as grant funds become more competitive.

Program Evaluation is several jobs rolled into one.  It is becoming more necessary for evaluators to possess multiple skills: research, grant writing, statistics, computer database design, strategic planning, data analysis, and graphic design (among others).

My hope is to provide a few tips to jump start your evaluation efforts.  Through discussions with non-profit councils, I’ve realized that not every agency is equipped with an Evaluator or Data Manager; however, through collaboration we can begin to improve child and family evaluation in Colorado.

Program evaluation consists of three components: (1) Design, (2) Implementation, and (3) Evaluation.

Often the term “evaluation” is used to summarize all three components, yet they are distinct. We must be able to develop each component independently and flexibly, while remaining mindful of how all the components will ultimately interact.

Design:

The design phase gives us the opportunity to identify “what to track” in order to measure a program’s effectiveness. Generally, we can find what we need to measure (i.e., the “primary asks”) within our program’s grant. If a grant is well written, there may be a section titled “Measurables” or “Outcomes Desired” – these are great starting points for developing program evaluation. Additionally, we may have “secondary asks,” which generally come from stakeholders, board members or the collaborative groups our agencies partner with. I consider these of secondary importance because we are not legally obligated to report on them. We must therefore develop tracking systems for, and remain accountable to, our program grants or “primary asks,” which we are legally required to report on each year. Consider creating a “hierarchy of asks” – it will become easier to see which category a particular ask falls into (grant, stakeholder, board member, or collaborative ask). Ultimately, it may well be possible to tuck “secondary asks” into the existing “primary asks” identified within your program grants.

As part of the design, also consider using a Logic Model (a visual representation of the program), because it will allow you to identify the following:

(1) “Goals” of the program

(2) “Short-term” and “Long-term” goals

(3a) “Activities” you/staff will be required to perform

(3b) “Indicators,” which tell you how the “Activities” are progressing. Generally these are total numbers (i.e., “the total number of individuals served in a program” or “the total number of sites that have increased site rating levels from 2016 to 2017”)

(4) “Inputs,” which are all the materials/staff that make a program function.

Logic Models should be updated yearly and should be flexible enough to allow you to tuck in any new “primary” or “secondary” asks.
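A Logic Model can also be kept alongside its visual form as a simple data structure, which makes the yearly update easy to manage. Below is a minimal sketch in Python; the program name, field names and example values are all hypothetical, chosen only to mirror the five components above:

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """One record per program; reviewed and updated yearly."""
    program: str
    goals: list            # (1) overall goals of the program
    short_term_goals: list # (2) short-term goals
    long_term_goals: list  # (2) long-term goals
    activities: list       # (3a) what staff will be required to perform
    indicators: list       # (3b) totals showing how activities progress
    inputs: list           # (4) materials/staff that make the program run

# Hypothetical example values, for illustration only
model = LogicModel(
    program="Family Coaching",
    goals=["Strengthen family resilience"],
    short_term_goals=["Enroll 50 caregivers in 2017"],
    long_term_goals=["Raise site rating levels by 2018"],
    activities=["Serve individuals in the program"],
    indicators=["Total number of individuals served"],
    inputs=["2 coaches", "meeting space", "survey forms"],
)

# A new "secondary ask" can be tucked in without redesigning the model
model.indicators.append("Total number of sites with increased ratings")
print(len(model.indicators))  # 2
```

Keeping the model in one small structure like this also makes it obvious, at review time, which “asks” each indicator answers.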

Implementation:

Now that we’ve identified “what to track,” we need to consider how to track it. The implementation phase should focus on establishing tools to track our program outcomes (activities and indicators) for grant funders, and on creating protocols that staff can follow to track activities accurately and consistently for reporting. Using our Logic Models, we can visually outline and match each program “activity” to an “indicator.” Activities are generally outlined in our program grants (e.g., “report the total number of participants served in 2017”) and should be accompanied by an indicator (e.g., “the total number of participants served in 2017”). This lets us easily identify our program totals for reporting. Remember to create tracking protocols for staff – program grants often outline them (e.g., “at participant enrollment, complete a pre-survey assessment with the caregiver”); however, it may be necessary to create additional protocols for data entry. Inaccurate or inconsistent data entry undermines data integrity, and by taking the time to implement staff data-entry protocols we can prevent future data errors and maintenance.

Generally, Microsoft Excel or Access is sufficient to track program activities and will allow us to quickly calculate the total numbers (“indicators”). Avoid MS Word, which does not easily total numbers without additional time spent on manual counting. If our agency’s programs already use an existing database for tracking, we should always attempt to use it for pulling totals.
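If a program’s tracking data comes out of Excel or Access as a plain CSV export, the same indicator totals can be computed with a very short script. Here is a sketch using only the Python standard library; the column names and rows are hypothetical stand-ins for a real enrollment export:

```python
import csv
import io

# Hypothetical enrollment export; in practice this would be a CSV file
# saved from Excel or Access ("participant_id" and "year_enrolled" are
# made-up column names).
raw = io.StringIO(
    "participant_id,year_enrolled\n"
    "101,2017\n"
    "102,2017\n"
    "103,2016\n"
)

# Indicator: "the total number of participants served in 2017"
total_2017 = sum(
    1 for row in csv.DictReader(raw) if row["year_enrolled"] == "2017"
)
print(total_2017)  # 2
```

Because the count is computed rather than tallied by hand, rerunning the script after new data entry always yields an up-to-date indicator.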

Evaluation:

Lastly, if we have successfully identified our various “asks” and set up efficient systems to track, we can develop our program messaging and communication.  As grants become more competitive, we must be able to “show the good work we do” in an innovative and concise manner.  Consider using or producing maps to geographically display who/where we are serving, or infographics (ie: graphs, bar charts, pie charts and tables) to summarize and display our program’s impact.  Free and low-cost mapping and infographic options, such as ArcGIS, QGIS, Piktochart, and PowerPoint/Excel exist for agencies working from a limited budget.
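Before investing time in a polished infographic, it can help to preview the data as a quick text-based bar chart to see whether the comparison is worth charting at all. A small sketch in plain Python (the counties and counts are hypothetical example data):

```python
# Hypothetical counts of families served per county (illustration only)
served = {"Arapahoe": 120, "Adams": 75, "Denver": 40}

max_value = max(served.values())
for county, count in served.items():
    # Scale each bar to a maximum width of 30 characters
    bar = "#" * round(count / max_value * 30)
    print(f"{county:>9} | {bar} {count}")
```

If the bars tell a clear story in plain text, the same numbers will carry well into a bar chart built in Excel, PowerPoint, or an infographic tool.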

At the Arapahoe County Early Childhood Council (ACECC), we owe much of our initial evaluation and data development efforts to the ECCP Mini-Grant (2016).  As we move forward towards program messaging and communication (a focus for 2017-2018), we are excited to collaborate with the ECCP and other councils to further develop evaluation efforts and innovative messaging techniques in an effort to improve child and family outcomes in Colorado.

If you would like to discuss program evaluation and data reporting, please feel free to contact: Tim Walter, Program Evaluator and Data Analyst at ACECC at tim@acecc.org.  We look forward to future collaborations!

Data and Evaluation: Sifting Through the Static and the Sound (ECCP Mini-Grant Highlight, Part 1 of 2)

by Tim Walter, Arapahoe County Early Childhood Council

Big Data, Data Analytics, Data-Driven Analysis – do these buzzwords sound familiar? We have entered the age of “Data” (and whatever else you’d like to attach to the beginning or end of the word), and there is no going back. Data can be overwhelming – it is hard to determine what is important and what is just plain confusing. But data is not all bad; in fact, it can be what helps drive and guide program design, implementation, and evaluation efforts.

Program evaluation for non-profit agencies and organizations is possibly more important now than ever before. Every year, non-profits prepare to write and apply for new and old grants alike – so what makes your non-profit’s work special? Can you show the good work your organization does (and whether it is producing effective results)?

You might be asking: Why is data and evaluation of my program important and what does it have to do with my work?

“Program evaluation” and data analysis will continue to be the standard by which organizations are judged. For many years, science-driven professions (engineering, accounting, medicine) have been required to quantify their results and show accountability. Only more recently have the social sciences (public health and social work) been asked to do the same. It is now more necessary than ever for organizations to “show their good work,” as grant funders want to see accountability when awarding dollars.

The ECCP Mini-Grant supporting the Arapahoe County Early Childhood Council kicked off an inter-agency reassessment of how we tracked our program outcomes in past years, and we began to ask what “truly matters” for tracking as we move into 2017-2018 and beyond. Along the way, we developed Logic Model templates (modeled after Results Based Accountability theory) to identify program outcome goals and database design needs. Our ultimate goal is to create a data tracking system capable of producing images and visual reports that show our good work. This information will have a variety of other uses, but the process of designing a robust evaluation methodology proved far more manageable than we previously thought.

Program evaluation may seem daunting – but my hope is to offer a few more tips and opportunities that give others easy-to-use ideas and templates, design guidance, and a way to bring to life the good work being done.

Read more in part 2 of this blog post, coming later this month!

Early Childhood Partners Develop Shared Data Agenda to Collectively Move Needle for Children and Families

The Data Agenda was launched at the ECCP full partner meeting in January.

Data-driven decision making is a key element in ensuring collaborative efforts remain accountable for reaching shared results. The Early Childhood Colorado Partnership has worked since 2009 to continually use population-level data to inform efforts and move the needle to achieve results for children and families in Colorado. The Data Action Team, made up of partners from multiple organizations and agencies, discusses and collects data annually using a Results Based Accountability (RBA) approach, and spent the better part of 2016 updating the Data Agenda to reflect the Early Childhood Colorado Framework, which was itself updated in 2015.

The Data Agenda includes “systems” indicators at the performance level to better understand collective impact upon outcomes. Systems Performance Indicators provide shared accountability for, and reflection of, how the early childhood system is performing, rather than reflecting whole populations (i.e., children and families in Colorado). A Data Advocacy Agenda is in development so partners can continuously work together to advocate for improved data collection.

As state and local partners identify ways to align efforts to move the needle for children and families, the Data Agenda provides an opportunity for shared accountability. The Data Action Team will work throughout 2017 to capture baseline data and share the story behind the data broadly.

Want to join this work? Email earlychildhood@civiccanopy.org to learn more.
