Independent external evaluation of the New School's Operating Model - Terms of Reference

The Education Review Office has commissioned an independent external evaluator to help guide the development of our new approach.

Dr Delwyn Goodrick’s evaluation is designed to provide feedback at key phases of the development and implementation process.

 

New School Methodology Approach within ERO

Design of the external evaluation:

External Evaluator: Dr Delwyn Goodrick

September, 2020

 

The purpose of the External Evaluation

An external evaluation is required to provide independent guidance on the implementation of the school evaluation approach. There is value in undertaking a progressive evaluation that tracks alongside implementation. In this way findings derived from data from each phase can be used as a point of reflection and inform improvement. As this approach represents a shift from the traditional approach to reviewing work, it will be important to capture reviewers’ experiences within and across phases, not just at the end.

A key advantage of the approach is that the external evaluator can work alongside the implementation team from ERO. These processes will also be documented as part of the evaluation. With rapid turnaround of feedback from each phase, any issues emerging during implementation can be addressed swiftly.

The evaluation is based on Developmental Evaluation principles (Patton, 2010). Key elements include sharing of evidence at the conclusion of each phase, critical reflection opportunities, and progressive reporting to ERO.

A reporting phase is proposed at the conclusion of each key stage. These short summary reports will be aggregated into an overall evaluation report.

 

Key Evaluation Questions

The key evaluation questions were developed in consultation with Dr Ro Parsons and Sandra Collins. There may be additional questions that require focus as the evaluation progresses. The key questions orienting this evaluation are:

  1. To what extent does the school evaluation approach support educational improvement? How does the approach foster the schools’ capacity to undertake their own evaluation activities to inform improvement?
    Sub-questions: How does the new school evaluation approach work in practice with schools? (outcome domains: appropriateness, fidelity of implementation, effectiveness). What is the value of the collaborative platform? How did the reporting inform ongoing school improvement? In what ways can the school evaluation approach be improved? What were the key lessons learned?
  2. What were the successes and challenges experienced by Reviewers in implementing the new approach? What facilitated or inhibited implementation of the evaluation approach?
    Sub-questions: What conditions need to be in place for this approach to work (for example, readiness, some stability in structure)?
  3. What were key stakeholders’ (reviewers, schools, ERO) views about the strengths and weaknesses of the approach?
    Sub-questions: From the school perspective - what was the value of the ERO approach in practice? (feasibility, practicality, engagement, application). What were they able to achieve through collaboration with ERO? How does the approach work with different types of schools? (e.g., across school tiers and across diverse cultural and geographical contexts)
  4. What are the necessary tools/components that support consistency of approach across schools? [While the approach is tailored to the school context/needs, there must be a level of consistency in process. The data gathered to address this question will identify the tools/resources that were used during the pilot, and identify additional resources that may support the school improvement approach]

 

Assumptions underpinning the Design of the Evaluation

This EOI is based on the following assumptions:

  • The collaborative base that underpins the school evaluation approach will be mirrored in the evaluation approach. The evaluation will formally begin in September, 2020. The evaluation is based on principles of co-design, and a collaborative platform has shaped this plan. However, Delwyn (as the independent evaluator) will be responsible for all data collection, analysis, synthesis and reporting activities.
  • I understand that fifteen reviewers will participate in the initial roll-out of the approach. Each reviewer will work with five schools across the country. They are not new reviewers, and have some experience in undertaking ERO reviews, but they will be trained in a new school evaluation approach. This approach differs from the historical approach to review. While the role of ERO in ensuring accountability remains, reviewers will be expected to work in a more collaborative way with schools to tailor approaches to school context and need. Rather than periodic reviews, the new approach will foster an ongoing relationship to support school self-evaluation and improvement.
  • The new school evaluation approach has a number of key stages. Reviewers learn about each stage before working with the schools, and then have the opportunity to implement their learnings in their five schools.
  • The evaluation is focused on implementation of the five phases of the school evaluation approach, not the design or value of the training that preceded implementation. There may be references to this in evaluation reports, but the focus is on the effectiveness and experience of implementation within the schools.
  • The five phases of the new school evaluation approach are: Engaging, Exploring and Deciding, Designing and Differentiating, Evaluating for Improvement, and Planning and Reporting. ERO Principles of Practice underpin the school evaluation approach. Each phase is distinct, but the phases are intended to work cumulatively to form a strong base of evaluation capacity and support continuous improvement.
  • The fifteen reviewers will begin implementation of the new approach in September-October 2021. The external evaluator will design feedback mechanisms that allow progressive feedback (e.g., journal templates to facilitate completion by reviewers) and post-stage/phase interviews. This means that at the end of implementation of each phase the evaluator will obtain feedback from reviewers about their experience via a group Zoom interview. Questions will be tailored to each phase/stage, but there will be a focus on eliciting what worked, what didn’t, challenges, opportunities, breakthroughs (moments of truth), support required to implement the phase well, and the adequacy and appropriateness of reviewers’ preparation.
  • A survey will be developed to document school experience of the entire process. The survey will be sent to all 75 schools, with follow-up by the evaluator to boost the response rate. Five schools will be selected for a ‘deeper dive’ into their experience and reflection on each stage. The focus here will be on understanding how the new approach is experienced by schools: How is it different from traditional review experiences? What value does this approach offer to schools? What are the strengths and weaknesses? The five schools will be selected on the basis of purposive sampling, and a maximum variation sampling strategy (using dimensions such as primary/secondary and urban/rural) will be adopted to ensure a range of schools participate in the evaluation. Alternatively, key schools may be selected on the basis of emerging success or challenges, or other relevant criteria.
  • While the short implementation timeframe inhibits a full Developmental Evaluation (DE) Approach, there are a number of principles of DE that will be adopted in this evaluation. Key elements of DE and implications are outlined below.

 

Overall Approach

Developmental principles underpinning the evaluation, and what each means in practice:

  • Feedback over time about ERO approaches, learnings, and promising practice to support direction and/or affirm changes in direction. In practice: design, methods of data collection/tools and reporting requirements are developed in collaboration with ERO.
  • Documentation of emerging activities that are developed to address the needs of schools. In practice: it will be important to document the strategies that reviewers use to be situationally responsive within schools. There is an opportunity to learn what works in particular contexts.
  • Sharing findings with reviewers, managers and design leaders to facilitate learning. In practice: the reviewers will share activities, experiences and learnings with the schools, and this information will be synthesised and shared in internal reports. Opportunities for reflection as a critical part of learning will be explicitly incorporated within co-design workshops/forums.
  • Mixed methods of data collection are valuable in generating an understanding. In practice: qualitative and quantitative methods will be implemented to capture activities and outputs, as well as signs of promising practices that contribute to outcomes.
  • Recognition that the projects/approaches are dynamic, not static, and designed to be adaptive to the context. In practice: the evaluation must generate an understanding of the way contextual differences influence approaches undertaken with schools. However, it must also identify the critical processes (approaches, tools and resources) that facilitate engagement of schools with self-evaluation.

Design and Methods

The evaluation will incorporate mixed methods, including interviews and surveys relevant for each phase of the school evaluation approach. Table 1 outlines data collection by phase.

  • Development of the theory of change, identification of boundary and scope, facilitating/participating in progress meetings, development of the evaluation plan, and progressive reporting requirements
  • Assessment and/or synthesis of secondary documentation about the content/development of the approach, key papers and articles, and other information prepared by ERO facilitators/leaders
  • Development (piloting), collation and analysis of one school survey to document the perspectives of school stakeholders
  • Progressive collation, analysis and feedback to ERO for progress reports/briefings
  • Secondary analysis of any reviewer feedback mechanisms (e.g., Journal feedback, questions asked post PD)
  • Interviews with school representatives from five schools (school level unit of analysis) across the country
  • Individual semi-structured interviews with Managers (x3)
  • Small group interview with the Design and Implementation team
  • Group Interview with reviewers on Zoom at the end of the entire school evaluation process (at the end of the pilot) and an evaluation feedback forum to profile learnings and sharpen recommendations.

 

Table 1: Evaluation Data Collection Matrix

For each key question, the matrix sets out the information requirements, methods, sources, and the timing of data collection, analysis and progress reporting.

Key question 1: To what extent does the school evaluation approach support educational improvement? How does the approach foster the schools’ capacity to undertake their own evaluation activities to inform improvement?

Information requirements:
1a. Underlying theory behind the approach (rationale and existing feedback from schools on their needs)
1b. What was the nature of the improvements made in the schools that can be attributed to school participation in the pilot?

Methods:
1a. Del to draft from existing documentation and discussion with Ro/Sandra and other approach architects
1b. School survey (near the end of the pilot); interviews with school representatives from a purposive sample of five schools; reflections from reviewers (observation/noticing/documenting)

Source:
1a. Documentation within ERO; approach ‘architects’
1b. School stakeholders; school participants (key informants within each of the five schools)
1c. Reviewers (15)

Timing:
1a. Before Phase 1 implementation commences
1b. Near the end of the pilot
1b. Progressively across key phases and cumulatively at the end

Key question 2: What were the successes and challenges experienced by Reviewers in implementing the new approach? What facilitated or inhibited implementation of the evaluation approach?

Information requirements:
2a. Number of schools that participated/engaged and sustained engagement across the five stages
2b. Assessment of school readiness and level of engagement
2c. Appropriateness of reviewers’ preparation for implementation
2d. What did reviewers find helpful or not helpful? What worked or didn’t work in a practical sense?

Methods:
2a. Simple count across phases and schools, and feedback from reviewers
2b. Group feedback (Zoom) at the end of each session to elicit views about their experience of initial engagement (and subsequent engagement at the conclusion of each phase); interviews with Managers
2c. Group feedback (Zoom) and individual accounts (sent directly to Del)
2d. Short template feedback form (Del to design) at the end of each phase; group feedback

Source:
2a. Administrative data and group feedback
2b. All reviewers (15); Managers (x3)
2c. All reviewers (15)
2d. All reviewers

 
Key question 3: What are the necessary tools/components that support consistency of approach across schools?

Information requirements:
3a. What tools did the reviewers use? How did they work with the schools? What were the gaps in existing tools?
3b. Which tools/materials are necessary/key to each stage?
3c. Which approaches did not work so well?

Methods:
3a. Desktop review of the tools/materials used as prompts by reviewers according to each stage
3b. Content analysis of tools and focus (used by each reviewer) and feedback about those tools
3c. Feedback (Zoom)

Source:
3a. ERO administration
3b. Reviewers’ perspectives on the gaps in tools; school feedback about the tools
3c. Reviewers’ perspectives on the tools used in each phase; school feedback about the tools

Timing:
3a. In the last phase of the pilot
3b. Progressively
3c. Progressively

Key question 4: What were key stakeholders’ (reviewers, schools, ERO) views about the strengths and weaknesses of the approach?

Information requirements:
4a. Time commitment and perceived burden
4b. Appropriateness of the focus given school context
4c. Relational dimensions: What are the attributes of reviewers who do this well? What can be learned about the skills required for this work?
4d. How did the work with ERO reviewers shape school improvement planning/strategic planning?
4e. What improvements in the process would facilitate better engagement from schools (less directive/more directive/provision of tools/broad approaches)?

Methods:
4a-4e. Interviews with school stakeholders at the five deep-dive schools
4a-4e. School survey
4a-4e. Group Zoom reflective discussion with reviewers on these dimensions

Source:
4a-4d. Five deep-dive schools
4b-4d. All schools (75) (represented stakeholders to be identified)
Reviewers (15)
Managers (x3)

Timing:
Near the end of the pilot
4b-4d. End of Phase 5

Key question 5 (overall): What were the lessons learned? How can the process (approach, staging and materials) be improved?

Information requirements:
A synthesis of key messages/learnings from implementation of the new approach

Methods:
Complete data set: progressive feedback from reviewers, interviews with school stakeholders, and review of templates and feedback provided
Evaluation Feedback Forum

Source:
Collation of all materials and data gathered during the evaluation
Participants at the Evaluation Feedback Forum

Reporting:
A succinct final report on the pilot. The progressive reports will be included as an appendix and referred to as appropriate in the main report.

Table 2 presents the deliverables aligned to key stages of implementation, with dates. I welcome feedback on the appropriateness of the phasing of the external evaluation.

Table 2: Implementation, Evaluation Stages and Deliverables

For each evaluation stage (and its alignment to the school evaluation approach), the table sets out the external evaluation activities and the deliverable(s).

Stage: External Evaluation Inception

Activities:
- Develop evaluation plan from EOI
- Share evaluation purpose, process and plan with the reviewers for comment (to model a participatory process)

Deliverables:
1. The external evaluation plan
2. Introduction letter to reviewers outlining the approach to the external evaluation

Stage: Feedback from reviewers on the first two stages: implementation, experiences and challenges (aligned to stages 1 and 2)

Activities:
- Del facilitates a post-phase reflection activity on Zoom with the reviewers (questions to be designed)
- Reviewers are asked to complete a short feedback template

Deliverables:
3. Questions for Zoom feedback/reflection
4. First group Zoom feedback meeting with reviewers (one hour post existing meeting or scheduled independently)

Stage: Exploring (stage 1) and Deciding (stage 2) stages (reviewers’ implementation with schools)

Activities:
- Drafting items for school survey
- Crafting short progress report on first Zoom reflection

Deliverable:
5. Short report from initial reflections (Zoom meeting) on stages 1 and 2
Stage: Formal Review of Engaging (1), Exploring and Deciding stages

Activities:
- Interviews with managers and design team
- Template reports distributed to reviewers

Deliverables:
6. Interviews with managers and design team
6a. Documentation of discussion and implications for feedback
6b. Criteria and selection of five deep-dive schools (jointly decided with ERO)

Stage: Designing and Differentiating (Stage 4), a key stage in the implementation of the new school approach

Deliverable:
7. Zoom feedback and template completion

Stage: Evaluating for Improvement (Stage 5)

Stage: Data collection across schools and with the 5 case profile schools

Activity:
- Survey of all schools

Deliverable:
8. School survey

Stage: Synthesis and reporting (External Evaluation), at the conclusion of phase 5

Deliverable:
9. Forum facilitation: agenda preparation and summary document

Stage: Synthesis and reporting (External Evaluation)

Activity:
- Analysis and synthesis of all data: survey of schools, 5 school profiles, reviewers’ feedback and commentary, review of secondary documentation

Deliverable:
10. Final evaluation report with progress reports embedded or as appendices (one month post conclusion of pilot)