● Interviews with Connected North Staff were important, since staff are the nexus between how the program is designed in theory and how it is realized in practice. Relevant interview questions broadly explored how staff support program implementation, from recruitment to session design and delivery.

● Interviews with Content Providers explored how sessions are designed and delivered in practice, including the considerations guiding session development, the support received from Connected North, and the factors influencing effective delivery and follow-up across different contexts.

● Interviews with Teachers explored the extent to which sessions exhibit the intended key program features, including teachers' own roles in making sessions a success.

Interview data was supplemented by a content analysis of write-in responses from the Teacher and Content Provider Surveys, as well as quantitative indicators drawn from rating scales on those surveys.

5. Does the program appear to be contributing to its intended short- and medium-term outcomes?

The challenge of attributing changes in student outcomes to participation in Connected North was the main impetus behind adopting the Contribution Analysis approach. Accordingly, this question examines what the evidence says about whether Connected North can be seen to be influencing student outcomes, particularly the short- and medium-term outcomes on which teachers are more likely to have a line of sight. Four lines of evidence provided insight into this question: interviews with teachers; content analysis of the write-in questions on the Teacher Survey; quantitative analysis of the rating scale responses on the Teacher Survey; and quantitative analysis of the rating scale responses on the Student Survey.

As a secondary line of inquiry, the evaluation also examined contextual factors identified in the Program Theory Assumptions, specifically whether Connected North appears to be similarly effective for students with different characteristics (e.g., age) and across diverse subject areas (e.g., mathematics, arts). Lines of evidence included a literature review on the differential effects of enriching learning experiences (remote or live) across student populations, as well as teacher interviews and content analysis of write-in survey responses to assess whether educators had observed such differences in practice.

Notably, the evaluation did not address the Program Theory Assumption regarding whether the effect of the program is linear, i.e., that there is no minimum number of sessions before effects begin, and no maximum number after which effects would decline.⁴

⁴ The evaluation originally set out to explore this question through teacher interviews, but the questions did not elicit sufficient data for analysis.

