
Collections

Navigating complexity: Crafting relevant stories through pragmatism and adapting to the unexpected

Crafting compelling stories that communicate an evaluation's objectives, methods, results, and recommendations in a way that is easy to understand and remember is a crucial aspect of evaluation reporting. It often involves real-life examples, anecdotes, and visuals that make the evaluation more relatable and impactful. Traditional research and evaluation approaches often fall short in capturing the complexities and nuances of programs, and complex programs with multiple components tend to present equally complex evaluation challenges.

This was the case with the Indiana STEM Teacher Residency project, which aimed to recruit and train culturally competent, highly qualified STEM teachers for the diverse context of the Indianapolis Public Schools (IPS) district.

Our team had to adopt a nimble, flexible approach to evaluating the project in the face of numerous unanticipated challenges: the pandemic, a low participant recruitment rate, personnel turnover, and the complexity of the context itself. Guided by Crane et al.'s (2018) principles of pragmatism, we drew on theoretical flexibility, methodological comprehensiveness, and operational practicality. Theoretical flexibility enabled us to combine formative, summative, and developmental techniques and to use participatory methodologies that accounted for contextual factors as well as process and impact measures. Methodological comprehensiveness meant using multiple methods, both quantitative and qualitative, to examine different facets and levels of program implementation and to look beyond meta-narratives, making every voice count. Operational practicality allowed us to communicate real-time, actionable feedback to stakeholders, account for real-life limitations and constraints, and manage the complexity of the project.

In summary, our evaluation blended diverse methodologies, arrived at a comprehensive understanding of the program's intricacies, and communicated findings effectively. It also helped us manage complexity, adapt to unforeseen challenges, and offer real-time feedback, turning obstacles into opportunities. Pragmatism was the linchpin that enabled us to weave our methodologies together and enhance our narrative's relevance and value to stakeholders and the broader educational community.

 


Damilola R Seyi-Oderinde onto AEA 2023

Lessons Learned Using Evaluation to Weave Stories... about the results of a large, interdisciplinary, multinational development project


Laura a Warner onto AEA 2023

Evaluating Institutional Change Efforts: Evaluators as Story Collectors & Story Analysts

Instruction Matters: Purdue Academic Course Transformation (IMPACT) is a faculty development program that aims to educate instructors about research-based teaching practices that enhance their ability to create student-centered learning environments.

Moreover, IMPACT is based on the understanding that instructor knowledge, implicit beliefs, and motivation for teaching and learning shape course design, implementation, and impact on student learning and success.

Instructors who are supported through peer communities and evidence-based resources will develop the motivation to transform their instruction through student-centered practices to achieve equity in student learning and success.

To access the ELRC's new logic model (2021) and theory of change described above, please view the logic model in this post.

For IMPACT annual reports from 2013 to 2023, please click this link.


Alex J France onto AEA 2023

Lending Credibility to Your Evaluation Story with Data Quality and Assurance Approaches

Data quality and data assurance are essential components of all evaluation and research projects. When working as a single evaluator or as part of a unified team, ensuring a common understanding of evaluation goals and approaches, and fidelity in data collection methodologies, can be simple. However, these activities require greater thought and intentionality as teams grow larger and more dispersed. This poster presents findings and lessons learned from a large, multi-disciplinary, multi-national evaluation in Somalia. The project included in-country partners and employed data collection enumerators from local study communities. Language barriers combined with security challenges precluded visits by the US team to aid in enumerator training, observe data collection, or otherwise participate in in-country activities. Thus, we needed to develop and enact strategies and tools that would minimize the risks associated with miscommunication, misunderstanding, or misinterpretation.


Ann Bessenbacher onto AEA 2023

Staff Photos


Laura a Warner onto Blog Pictures

Scavenger hunt


Laura a Warner onto Blog Pictures

Educational Equity in Somalia-(Re)Shaping Evaluation to Inform Policy


Laura a Warner onto AEA 2022

Readiness, Willingness & Ability Assessment: Creating Opportunities to Foster Multi-Stakeholder Institutional Change

To help instigate institutional change focused on diversity, equity, and inclusion, we developed and piloted a new tool, grounded in organizational psychology and behavior change theory, that promotes a sound approach to understanding an institution's inclination toward change and the likelihood that improved practices will be used to instigate change effectively, from multiple stakeholders' points of view. The Readiness, Willingness, and Ability (RWA) tool is grounded in the Theory of Planned Behavior (TPB) (Ajzen, 1985) and examines individuals' perceptions of the degree to which their colleagues, leadership, and institution are positioned for change. The RWA domains were not designed as a psychometric assessment of TPB; rather, the tool is rooted in TPB while accommodating the need for practical administration and for ease of understanding and application as an institutional evaluation tool. In this session, we will present our tool, findings from five institutions, and next steps for its use.


Lindley McDavid onto AEA 2022

The multi-national evaluation team led by the ELRC met in Nairobi, Kenya, to analyze Somali education data and formulate recommendations for education programming and policy.


Laura a Warner onto AEA 2022

Impact-ful Redesign: Using R Markdown for Evaluative Reporting


Kyle Steven Habig onto AEA 2022

(Re) shaping Evaluation Reports: Building a Report Template in R Markdown

Project report development can be a resource-consuming endeavor. During this demonstration, participants will learn how to create an R Markdown file that integrates standard reporting text with R code and can generate Word, PDF, or HTML reports at the click of a button. Attendees will learn how to add visualizations, graphics, statistical analyses, and tables to these reports. Standard R coding procedures and strategies that we have developed will be shared, along with access to the code for everything shown during the demonstration. We will also cover procedures for maintaining participant anonymity and ways to highlight subgroup differences constructively. Because R Markdown is free and open source, end users can share code with anyone around the world, eliminating the need to pay for expensive software and training.
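As a rough sketch of the kind of file the demonstration describes (the section heading, chunk name, and data below are illustrative, not the actual demonstration code), an R Markdown report source might look like this:

    ---
    title: "Evaluation Report"
    output: word_document   # or pdf_document / html_document
    ---

    ## Participant Summary

    ```{r participant-table, echo = FALSE}
    # Illustrative data; a real report would load de-identified project data
    dat <- data.frame(group = c("Cohort A", "Cohort B"), participants = c(12, 15))
    knitr::kable(dat, caption = "Participants by cohort")
    ```

Calling rmarkdown::render() on the file (or clicking Knit in RStudio) produces the finished report in the chosen output format.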


Ann Bessenbacher onto AEA 2022

(Re) shaping Evaluation Reports: An Evaluation Report Templating Process

For evaluation centers, project report development can be a resource-consuming endeavor, so developing a report template is useful both for an advanced start on the reporting process and for creating a standardized approach and look. This collaborative process helps centers prepare a report structure early in a project using predesigned templates and align data collection and analysis with the report content before data collection begins. It streamlines report development by enabling teams to design reports together and then hand them off to a colleague who can complete the report independently. The templates can be created with R and R Markdown code (though the process applies equally to any software). Report templates also give centers the information needed to develop dashboards and ad hoc reports, as well as a means for data team members to create a repository of commonly used scales, data sources, analysis methods, and metadata.
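As a sketch of the hand-off step, assuming the template is parameterized (the file name and parameters below are hypothetical, not the ELRC's actual template), a colleague could render a project-specific report without editing the template itself:

    library(rmarkdown)

    # Hypothetical template whose YAML header declares matching `params`
    render("report_template.Rmd",
           output_format = "word_document",
           output_file   = "impact_2022_report.docx",
           params        = list(project = "IMPACT", year = 2022))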


Ann Bessenbacher onto AEA 2022

Using Outcome Mapping Tools to Structure Reflection and Learning Opportunities in a Large, Multi-National Development Project


Loran Carleton Parker onto AEA 2022

AEA 2022 ELRC Presentation Schedule

Catch up with the ELRC Team from Purdue at AEA 2022 for one or more of their presentations:

Presentation and Lead Author | Date | Location
Ignite Session: (Re)shaping Evaluation Reports: An Evaluation Report Templating Process (Ann Bessenbacher) | 11/9/22 5:30-6:30pm | Celestin F
Ignite Session: Readiness, Willingness & Ability Assessment: Creating Opportunities to Foster Multi-Stakeholder Institutional Change (Lindley McDavid) | 11/9/22 5:30-6:30pm | Celestin F
Poster: Educational Equity in Somalia — (Re)shaping Evaluation to Inform Policy (Christiana Akande) | 11/9/22 6:30-8:30pm | Elite Hall A - 196
Poster: Using Outcome Mapping Tools to Structure Reflection and Learning Opportunities in a Large, Multi-National Development Project (Loran Parker) | 11/9/22 6:30-8:30pm | Elite Hall A - 131
Poster: IMPACT-ful Evaluation: Report Templating Using R Markdown (Alex France) | 11/9/22 6:30-8:30pm | Elite Hall A - 75
Poster: (Re)shaping Evaluation Reports: Building a Report Template in R Markdown (Ann Bessenbacher) | 11/9/22 6:30-8:30pm | Elite Hall A - 70
Panel: Using evaluations for deeper understanding of marginalization dynamics: the case of Somalia (Wilella Burgess) | 11/11/22 10:30am-12pm | Celestin G
Birds of a Feather: (Re)shaping Evaluation Across Borders (Christiana Akande) | 11/11/22 |


Ann Bessenbacher onto AEA 2022

Menti Meter

An interactive online question and polling system.


Ann Bessenbacher onto Combo Test Collection

Menti Meter

An interactive online question and polling system.


Ann Bessenbacher onto Cool Tools

Data Visualization Checklist from Stephanie Evergreen and Ann K. Emery

This checklist is meant to be used as a guide for developing high-impact data visualizations. Rate each aspect of the data visualization by circling the most appropriate number, where 2 points means the guideline was fully met, 1 means it was partially met, and 0 means it was not met at all. The n/a rating should not be used frequently; reserve it for when a guideline truly does not apply (for example, a pie chart has no axis lines or tick marks to rate). If a guideline has been broken intentionally to make a point, rate it n/a and deduct those points from the total possible. Refer to the Data Visualization Anatomy Chart on the last page for guidance on vocabulary and the Resources at the end for more details. (Stephanie Evergreen & Ann K. Emery, May 2016)
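As a sketch of the scoring arithmetic the checklist describes (the ratings vector below is invented for illustration), the final percentage could be computed in R like so:

    # Ratings per guideline: 2 = fully met, 1 = partially met, 0 = not met,
    # NA = guideline does not apply (n/a)
    ratings <- c(2, 2, 1, 0, 2, NA, 1)

    rated    <- ratings[!is.na(ratings)]   # n/a guidelines drop out entirely
    earned   <- sum(rated)                 # points earned
    possible <- 2 * length(rated)          # n/a points deducted from total possible
    round(100 * earned / possible)         # percent of possible points met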


Ann Bessenbacher onto Combo Test Collection

Evaluation Report Layout by Stephanie Evergreen

This checklist is meant to be used as a diagnostic guide to identify elements of evaluation reports that could be enhanced using graphic design best practices and/or the assistance of a graphic design expert. Suggestions are best suited for those using standard Microsoft Word software.


Ann Bessenbacher onto Combo Test Collection

Likert bar chart

R Visualizations
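As a minimal sketch of one way to build this kind of chart in R (the survey items and percentages below are invented for illustration), a stacked Likert bar can be drawn with ggplot2:

    library(ggplot2)

    # Invented response percentages for two survey items
    likert <- data.frame(
      item     = rep(c("Q1", "Q2"), each = 3),
      response = factor(rep(c("Disagree", "Neutral", "Agree"), times = 2),
                        levels = c("Disagree", "Neutral", "Agree")),
      pct      = c(20, 30, 50, 10, 25, 65)
    )

    # One horizontal stacked bar per item, segments ordered by response level
    ggplot(likert, aes(x = item, y = pct, fill = response)) +
      geom_col() +
      coord_flip() +
      labs(x = NULL, y = "Percent of respondents", fill = NULL)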


Ann Bessenbacher onto Combo Test Collection