
2016 American Evaluation Association Annual International Conference

Wednesday, October 26


4:30-6:00pm: AEA GEDI 2016 Panel: Lessons Learned on Culturally Responsive Evaluation

The Graduate Education Diversity Internship (GEDI) scholars will discuss lessons learned about cultural competence and culturally responsive evaluation (CRE) from their year-long program experiences. Panelists will discuss the learning opportunities in CRE provided by the GEDI program, how they applied this learning at their internship sites, and ways in which CRE can be incorporated into evaluation design more generally. The panel will also share the scholars’ thoughts on their group project, which analyzed how AEA 2015 conference participants understood and practiced CRE in their own evaluation work. In all, these discussions highlight both the positive experiences and the challenges that interns faced at sites throughout the United States, all of which contributed to their learning and development as evaluators.


Presenters
  • Ashaki M. Jackson, Consultant, Graduate Education Diversity Internship (GEDI) (Chair)
  • Jamie Lynn Vickery, Graduate Research Assistant, Natural Hazards Center–CU Boulder
  • Angela Mendoza
  • Thana-Ashley Charles, Consultant, TCC Group

 

Thursday, October 27


7:00-7:45am: PTTDE2: Program Theory as a Tool for Program Design


One barrier to viable program evaluations is that development programs are based on assumptions that often are not well articulated. In designing programs, stakeholders often lack clear outlines of how the implemented interventions will bring about the desired changes. This lack of clarity masks critical risks to program success and makes such programs difficult to evaluate. The conversation on examining program assumptions needs to start with a typology that clarifies the various categories of assumptions and differentiates those worth examining from those that are not. I propose and illustrate a typology of five critical program assumptions: normative, diagnostic, prescriptive, causal, and external. I also suggest a method for mainstreaming inquiry into assumptions within the broader evaluation design.

Presenters
  • Charles Gasper, Senior Consultant, TCC Group (Chair)
  • Apollo Nkwake, AWARD/CGIA
  • Huey T. Chen, Professor, Mercer University
  • Jessica Ozan, Research Associate, Manchester Metropolitan University

 

8:00-9:30am: 1285: Advocacy as a Team Game: Methods for Evaluating Multi-Stakeholder Advocacy Efforts

In today’s complex advocacy environment, it is rare for a single organization to pursue its goals alone. Rather, organizations generally work together, sometimes in highly coordinated campaigns and sometimes in less formal networks. Consequently, evaluators must address not only questions related to advocacy outcomes but also the particular roles and contributions of multiple partners that work directly together on some activities and in parallel on others. This session builds on last year’s well-attended and positively reviewed session, beginning with an introduction to the complex issues that must be considered when designing and deploying evaluation methods in multi-stakeholder environments. The speakers will then share specific methods in the context of case studies, including network analysis, field mapping, machine learning in media analysis, and consensus-tracking tools and dashboards. The session will end with a discussion of when different methods apply and their strengths and limitations.


Presenters
  • Jewlya Lynn, CEO, Spark Policy Institute (Chair)
  • Jared Raynor, Director, Evaluation, TCC Group
  • Rebecca Ochtera, Senior Researcher, Spark Policy Institute
  • Anne Gienapp, Senior Affiliated Consultant, ORS Impact

 

11:00-11:45am: 1735: Designing a New Community of Practice: Celebrating 10 years of the Advocacy and Policy Change TIG through reflections from past and present TIG chairs


Ten years ago at AEA, a small group of people convened to formally complete the process of establishing a new topical interest group (TIG): Advocacy and Policy Change. Since then, the TIG has seen significant growth in membership and in interest in the topic. As a field, we have progressed significantly, with a wide range of ideas, tools, approaches, and a forthcoming book on the topic. What does it mean to design a new field of practice? This session will share reflections from the past and current co-chairs of the TIG: What is the state of the field? What did building a field actually look like? Have we arrived as a clear community of practice? Audience members will be invited to share their own perspectives and ideas for the future.

Presenters
  • Jared Raynor, Director, Evaluation, TCC Group
  • Julia Coffman, Director, Center for Evaluation Innovation
  • Jackie W Kaye, Wellspring Advisors
  • Annette Lenore Gardner, Assistant Professor, University of California, San Francisco
  • David Devlin-Foltz, Vice President, Impact Assessment and Executive Director, The Aspen Institute

 

Friday, October 28

8:00-9:30am: 1831: Designing a partner-centered ECB initiative: 360° perspectives from a corporate philanthropy program

As demand for accountability in grantmaking increases, many funders are turning to evaluation capacity-building (ECB) initiatives to fill the gap between funders’ expectations and nonprofits’ ability to evaluate grant results.

When it comes to designing ECB activities, “right-sizing” to partners’ specific needs is recommended. For Johnson & Johnson’s corporate philanthropy, designing a successful ECB initiative meant harnessing the partner-centered approach that is the cornerstone of its grantmaking strategy. The ECB projects were tailored to partners’ needs through a multi-pronged assessment process, which gave each grantee the opportunity to articulate its own needs.

Drawing on this recent experience at J&J, the panel will describe the design and implementation of the partner-centered ECB initiative, share findings on its overall effectiveness, and discuss linkages to the broader ECB field. By convening the funder, nonprofit partners, and evaluators involved, this diverse panel will advance the conversation around a topic of growing importance.


Presenters
  • Michael Bzdak, Director, Corporate Contributions, Johnson & Johnson (Chair)
  • Laura Hollod, Measurement & Evaluation Specialist, Johnson & Johnson Corporate Contributions
  • Julie Solomon, Principal & Member, J. Solomon Consulting, LLC
  • Lisa Frantzen, Senior Evaluation Consultant, TCC Group

 

3:30-4:15pm: 2690: Measuring Innovative Movement-Building


For decades, evaluators have supported traditional movement-building and advocacy interventions by generating evidence of changes in knowledge, attitudes, and behavior.

The advent and popularization of entertainment education (‘edu-tainment’) has, however, taken the movement-building field to a new level. Successful examples include the Population Foundation of India’s drama series Main Kuch Bhi Kar Sakti Hoon (I, A Woman, Can Achieve Anything) and the Institute for Health and Development Communication’s Soul City. By integrating social messages into popular entertainment programs, these initiatives have seeded new ideas and shifted public opinion on issues related to reproductive health, gender, and HIV/AIDS.

The convergence of mass media, information and communications technology, and popular culture within the entertainment education field presents both a challenge and an opportunity for the evaluation community. This panel will focus on how to build on existing practice for evaluating movement-building interventions and how that practice may be applied to evaluation in the ‘edu-tainment’ space.

 

Presenters
  • Nancy MacPherson, The Rockefeller Foundation (Chair)
  • Charles Gasper, Senior Consultant, TCC Group

 

Saturday, October 29

9:45-10:30am: 1872: Are we there yet? Applying Rapid Cycle Learning Methods to Evaluation within a Foundation’s Program Design


Following four decades of leadership development programming and a comprehensive strategic planning process, the Robert Wood Johnson Foundation (RWJF) decided to restructure its well-known leadership development programs. As part of the new Culture of Health vision and the foundation-wide emergent strategy approach, the new programs are being co-created with grantees, advisors, staff, and other stakeholders, and a “learning-as-we-go” approach has been adopted to support constant refinement.

Rather than waiting until program implementation to begin evaluation, RWJF took the innovative approach of evaluating the program strategy and design process itself. This required designing rapid, intentional, iterative learning loops to understand whether program development components aligned with established goals, why results occurred, and what could be improved in the future. This presentation explores why this approach was chosen, the tools used, the pitfalls encountered, and how the approach set the stage for evaluation during the implementation phase.

Presenters
  • Denise E Herrera, Robert Wood Johnson Foundation
  • Lisa Frantzen, Senior Evaluation Consultant, TCC Group
  • Jared Raynor, Director, Evaluation, TCC Group

 

10:45-11:30am: 1871: Making the Most of a Logic Model: Demonstrating New Methods to Increase Use and Learning through Logic Models

Logic models are tested, understandable formats for visualizing implementers’ programs and goals. But what else can logic models do, beyond providing a useful starting point for evaluation planning? This demonstration will highlight three methods for increasing the value of logic models to clients. First, Model Heat Mapping will demonstrate logic models being used to tie a funder’s financial support to model components. Second, Cohort Logic Modeling will illustrate how logic models can be used across a funding portfolio to create a shared vision among grant partners and to inform the development of evaluation tools. Third, the Change over Time method will demonstrate how logic models can display outcome progress over time. The demonstrators will share experiences and guidance in using these methods. The session will increase participants’ understanding of three new ways to foster learning with traditional logic models.

Presenters
  • Rose Kowalski, Evaluation Consultant, TCC Group
  • Katherine Locke, Associate Director, Evaluation, TCC Group
  • Deepti Sood, Senior Evaluation Consultant, TCC Group

Host

American Evaluation Association
