Collections

Navigating complexity: Crafting relevant stories through pragmatism and adapting to the unexpected

Crafting compelling stories that communicate an evaluation's objectives, methods, results, and recommendations in a way that is easy to understand and memorable is a crucial aspect of evaluation reporting. It often involves real-life examples, anecdotes, and visuals that make the evaluation more relatable and impactful. Traditional research and evaluation approaches often fall short in capturing the complexities and nuances of programs, and complex, multi-component programs present correspondingly complex evaluation challenges.

This was the case with the Indiana STEM Teacher Residency project, which aimed to recruit and train culturally competent, highly qualified teachers in STEM subjects for the diverse context of the Indianapolis Public Schools (IPS) district.

However, our team had to adopt a nimble, flexible approach to evaluating the project in the face of numerous unanticipated challenges, including the pandemic, a low participant recruitment rate, personnel turnover, and the complexity of the context itself. Guided by Crane et al.'s (2018) principles of pragmatism, we drew on theoretical flexibility, methodological comprehensiveness, and operational practicality. Theoretical flexibility enabled us to combine formative, summative, and developmental techniques and to use participatory methodologies that accounted not only for process and impact measures but also for contextual factors. Methodological comprehensiveness meant using multiple quantitative and qualitative methods, allowing us to examine different facets and levels of program implementation, look beyond meta-narratives, and make every voice count. Operational practicality allowed us to communicate real-time, actionable feedback to stakeholders, account for real-life limitations and constraints, and manage the complexity of the project.

In summary, our evaluation effort blended diverse methodologies, arrived at a comprehensive understanding of program intricacies, and effectively communicated findings. It also helped us manage complexity, adapt to unforeseen challenges, and offer real-time feedback, turning obstacles into opportunities. This approach was the linchpin that enabled us to weave our methodologies together and enhance our narrative's relevance and value to stakeholders and the broader educational community.

Damilola R Seyi-Oderinde onto AEA 2023

Lessons Learned Using Evaluation to Weave Stories... about the results of a large, interdisciplinary, multinational development project

Laura a Warner onto AEA 2023

Evaluating Institutional Change Efforts: Evaluators as Story Collectors & Story Analysts

Instruction Matters: Purdue Academic Course Transformation (IMPACT) is a faculty development program that aims to educate instructors about research-based teaching practices that enhance their ability to create student-centered learning environments.

Moreover, IMPACT is based on the understanding that instructor knowledge, implicit beliefs, and motivation for teaching and learning shape course design, implementation, and impact on student learning and success.

Instructors who are supported through peer communities and evidence-based resources will develop the motivation to transform their instruction through student-centered practices to achieve equity in student learning and success.

To access the ELRC's new logic model (2021) and theory of change described above, please view the logic model in this post.

For IMPACT annual reports from 2013 to 2023, please click this link.

Alex J France onto AEA 2023

Lending Credibility to Your Evaluation Story with Data Quality and Assurance Approaches

Data quality and data assurance are essential components of all evaluation and research projects. When working as a single evaluator or as part of a unified team, ensuring a common understanding of evaluation goals and approaches, and fidelity in data collection methodologies, can be simple. However, these activities require greater thought and intentionality as teams grow larger and more dispersed. This poster presents findings and lessons learned from a large, multi-disciplinary, multi-national evaluation in Somalia. The project included in-country partners and employed data collection enumerators from local study communities. Language barriers combined with security challenges precluded visits by the US team to aid in enumerator training, observe data collection, or otherwise participate in in-country activities. Thus, we needed to develop and enact strategies and tools that would minimize the risks associated with miscommunication, misunderstanding, or misinterpretation.

Ann Bessenbacher onto AEA 2023

Staff Photos

Laura a Warner onto Blog Pictures

Scavenger hunt

Laura a Warner onto Blog Pictures

Educational Equity in Somalia: (Re)Shaping Evaluation to Inform Policy

Laura a Warner onto AEA 2022

Readiness, Willingness & Ability Assessment: Creating Opportunities to Foster Multi-Stakeholder Institutional Change

To help instigate institutional change focused on diversity, equity, and inclusion, we developed and piloted a new tool, grounded in organizational psychology and behavior change theory, that promotes a sound approach to understanding an institution's inclination toward change, and the likelihood that improved practices will be used to instigate change effectively, from multiple stakeholders' points of view. The Readiness, Willingness, and Ability (RWA) tool is grounded in the Theory of Planned Behavior (TPB) (Ajzen, 1985) and examines individuals' perceptions of the degree to which their colleagues, leadership, and institution are positioned for change. The RWA domains were not designed to act as a psychometric assessment of TPB; rather, the tool is rooted in TPB while accommodating the need for practical administration and for ease of understanding and application as an institutional evaluation tool. In this session, we will present our tool, findings from five institutions, and next steps for its use.

Lindley McDavid onto AEA 2022

A multi-national evaluation team led by the ELRC met in Nairobi, Kenya, to analyze Somali education data and formulate recommendations for education programming and education policy.

Laura a Warner onto AEA 2022