Three Essential Elements for Evaluating Systems Initiatives

You may be hearing terms like systems change, collective impact, and multi-sectoral initiatives a lot these days to describe situations where multiple actors are working together to tackle complex social problems. If you’re working or considering working within one of these types of initiatives, it’s critical to first recognize the complexity of the system (or set of interconnected parts) you’re working with and to think about what success would look like within that system. Although systems are complex, systems thinking – and the corresponding evaluation design – can be broken down into three essential elements [1]:

  1. Understanding interrelationships between the actors involved and between the desired outcomes, 
  2. Engaging with multiple perspectives to see where an intervention has different purposes or framings, and
  3. Reflecting on boundaries, i.e., the choices made when deciding what is important and not important to measure.

In this blog, we share how we’ve applied these elements to evaluation design for two different types of systems initiatives in our work with Healthier New Brunswick and the New Brunswick Community Food Alliance (NBCFA).

Example #1: Healthier New Brunswick

Healthier New Brunswick is a city-wide alliance of organizations that work together, through various activities and programs, to improve the health and well-being of the city’s residents. Led by Johnson & Johnson, Rutgers Robert Wood Johnson Medical School, New Brunswick Tomorrow, and the City of New Brunswick, the alliance brings together the nonprofit, private, government, higher education, and healthcare sectors, with the participation of both large and small organizations within the community.

TCC Group partnered with Healthier New Brunswick to increase the evaluative capacity of the alliance and to develop a shared measurement system that would enable the various organizational partners to align under a common set of indicators of success. The three elements of systems thinking shaped our evaluation approach in the following ways:

1. Understanding interrelationships – In order to create a set of measurements that would align a wide range of stakeholders, we first needed to understand the strength of the interrelationships between organizations apart from their alliance participation, and the extent to which outcomes data was shared across organizations. Through workgroups composed of New Brunswick stakeholders, we sought to understand how the organizations were already working together and where data was already being collected throughout the community.



In most instances, the only central source of data was a bi-annual survey administered by one of the local nonprofits. Otherwise, data collected by nonprofits running domestic violence programs, or by agencies administering the Supplemental Nutrition Assistance Program (SNAP), for example, was not shared more broadly with the community. This meant that while various organizations were working toward the same outcomes, their lack of interaction and data sharing limited their ability to see the interrelationships between their programs. These insights served as an important starting point for understanding the feasibility of data collection across New Brunswick and the need for a central mechanism for compiling community data on residents’ well-being.

2. Engaging with multiple perspectives – We understood that “buy-in” across the sectors would be critical to a successful shared measurement system for the alliance. We therefore developed the shared measurements through a series of convenings and workgroups that always included a diverse group of participants – people addressing health and well-being from multiple perspectives.



Some participants brought expertise in nutrition and health, such as nutritionists, farmers’ market organizers, pediatricians, and dentists, while others represented medical institutions like medical schools and hospitals. Still other stakeholders offered different perspectives on community members’ needs: homeless program directors, city residents and officials, faith-based organizations, researchers, daycare/preschool providers, business owners, public school faculty, and other nonprofit service providers.

This diverse participation in the convenings and workgroups helped us understand the priority health issues from a wide range of residents’ perspectives. Participants also shed light on where we should prioritize indicators, such as the extent to which limited English proficiency was acting as a barrier to healthcare access, and how environmental factors such as neighborhood safety connected to residents’ mental health. This comprehensive picture of how residents experience health issues in New Brunswick led to a more robust set of community indicators that included the social determinants of health specific to New Brunswick.

3. Reflecting on boundaries – Finally, we reflected on the boundaries that our set of shared indicators was drawing (i.e., what would and would not be a part of the shared measures) and what that would mean for New Brunswick. 



Because the core set of indicators was developed around three specific health issues (mental health, nutrition, and access to care), we also included a qualitative indicator, which allowed organizations and workgroups in the alliance to report how they were contributing to, and seeing, improved well-being in areas beyond the core indicators. To gather this data, workgroups would be asked what changes they had seen in residents’ well-being. This broad-based indicator would supplement the agreed-upon shared measures and would more fully capture the breadth of work happening in New Brunswick to influence health and well-being outcomes. Without this supplemental indicator, the alliance would have had a gap in its understanding of the community’s work, and many significant stakeholders would have been left out of the shared measurement process.

 

Example #2: The New Brunswick Community Food Alliance

The New Brunswick Community Food Alliance (NBCFA) is an all-volunteer group of individuals who aim to make the local food system work for everybody in the community—and beyond. NBCFA works through a handful of targeted, issue-specific workgroups [2] to influence a system with a large number of actors and relationships—the New Brunswick food system. Numerous stakeholders are involved in food production and consumption—farmers, sellers, consumers, and everyone in between. As a result, building NBCFA’s capacity to evaluate food systems change would be a complex task.

In this case, TCC Group was brought on board to increase the evaluative capacity of this alliance. By working with each of their individual workgroups, we integrated the three elements of systems thinking as follows:

1. Understanding interrelationships – In order to build NBCFA’s systems evaluation capacity, we needed a strong understanding of the interrelationships among all the main food system actors. For example, local food growers, such as honey farmers, interact with consumers when they sell their goods; coordinate with organizers of farmers markets; and are affected by the food policy that local policymakers develop from the top down. Similarly, school cafeterias interact with a range of groups, including students, parents, and vendors.



After cataloguing these and many other relationships, we developed a logic model for each of the five workgroups. This exercise allowed NBCFA to see which stakeholders each workgroup was working with, where there were gaps, where any overlap occurred, and whether any key stakeholders had been left out.

2. Engaging with multiple perspectives – We engaged multiple perspectives throughout this process. Workgroup members brought perspectives from their work with other nonprofits, the local medical school, and other institutions. While developing the logic models, we brought in key members of each workgroup to provide input, drawing on their deep expertise in their groups’ respective subject areas.



For example, members of the Food Economic Development Workgroup understood the many issues affecting local residents who wanted to start food businesses, while the Agriculture Workgroup had a great deal of knowledge about the types of farming that have traditionally succeeded in the New Brunswick area. These insights allowed us to include somewhat technical outcomes on the logic models that we would not otherwise have known about, such as increased knowledge of policy requirements, financing options, and insurance, so that progress on those outcomes could ultimately be assessed.

3. Reflecting on boundaries – We then reflected on the boundaries of the desired systems change for the evaluation. In this case, that reflection meant delineating which types of systems change NBCFA felt they should and should not hold themselves accountable for. 



For example, they could hold themselves accountable for outputs like the number of heirloom seed packets they distributed and outcomes like the number of new local policies addressing the food system. On the other hand, they could not expect their work to immediately increase the environmental sustainability of New Brunswick farms, even though that was one of the ultimate impacts on their logic model. By putting these boundaries in place, NBCFA was able to arrive at a set of realistic outcomes that increased their accountability in an appropriate, effective, and relatively painless way.

In conclusion, using the three systems thinking elements in both of these projects allowed us to better understand the stages at which various stakeholders need to be involved in the evaluation design. It also helped promote buy-in for shared measurement systems that would show where the alliances were making progress and where they could improve. Equipped with the right data in the hands of the right stakeholders, these multi-stakeholder initiatives are now well positioned to achieve the systems changes they seek.

 

[1] Williams, Bob. “Systems Thinking.” BetterEvaluation. http://betterevaluation.org/en/blog/systems_thinking

[2] NBCFA’s workgroups are organized around five topics: Advocacy and Policy, Agriculture, Community Engagement, Food Economic Development, and Healthy Food Access.

Lisa Frantzen

Senior Consultant

Lisa’s experience includes working with foundations, nonprofits, governments, and corporations to increase the effectiveness of their social change efforts. Her work has centered on performance measurement and evaluation, strategic planning, and building evaluative capacity within organizations.

At TCC Group, Lisa has led evaluation engagements with clients including the Bill and Melinda Gates Foundation, the Robert Wood Johnson Foundation, Johnson & Johnson, Pew Charitable Trusts, Harvard University’s Center on the Developing Child, Hand in Hand International, Women Deliver, and Princeton in Africa.

Rose Konecky

Evaluation Consultant

As an evaluation consultant at TCC Group, Rose Konecky has experience with all stages of evaluation projects, including project design, implementation, analysis, and reporting.

Rose is skilled in the evaluation of various types of organizations, including those focused on capacity building, arts audience development, childhood literacy, and many others. Through her work with the Kate B. Reynolds Charitable Trust, the Program to Aid Citizen Enterprise (PACE), Marin Community Foundation, and the Irvine Foundation, she has proficiency evaluating capacity-building support provided by foundations to a cohort of grantees.

Rose holds an undergraduate degree in Political Science and History, and her graduate degree is in Political Science Data Methodology.