Part 5: Monitoring the support for, and progress of, students


5.1
In this Part, we set out our findings about how the Ministry:
- monitors the support it provides to students;
- receives feedback on the support provided to students;
- monitors the progress of students; and
- uses monitoring information to inform its planning.

Summary of our findings

5.2
The Ministry adequately monitors the quality and consistency of the support services it provides through the four initiatives, measuring performance against its Specialist Service Standards and through client satisfaction surveys. The Ministry has a range of peer review processes, and gains feedback from the community through stakeholder reference groups and, in some districts, through other community groups. The Ministry tracks the progress of individual students receiving support through the Individual Education Programme plan and Individual Care Plan processes. However, the Ministry is not able to review or evaluate the effectiveness of its support for students at an aggregated national level, and has identified the need for better systems to aggregate information.

5.3
We have made one recommendation in this Part, for the Ministry to further improve its systems for monitoring students’ progress at an aggregated level.

Monitoring the support provided to students

The Ministry monitors the quality and consistency of the support it provides to students through the four initiatives. It has various monitoring activities to measure performance against its Specialist Service Standards, and uses peer review processes.

All four initiatives

5.4
The Specialist Service Standards include the expectation that all Ministry staff will comply with professional standards. The Ministry’s service managers carry out a range of activities to check their staff’s compliance with the Specialist Service Standards, including selected and random reviews of individual student files. Service managers report the results of these reviews to district managers. In all the district offices of one region, service managers require staff to complete a certain number of student file reviews each school term.

5.5
Other monitoring activities include discussing individual students during professional supervision sessions and at team meetings, assessing and responding to the results of client satisfaction surveys, and gathering informal feedback from schools and parents/caregivers. In some districts, compliance with the Specialist Service Standards is also examined during staff performance reviews.

5.6
The Ministry also has peer review processes to monitor the quality and consistency of support provided to students, including the national Review of Individual Behaviour Service (RIBS) process. All open individual student behaviour cases are eligible for this review. The RIBS process brings specialists working with students with behavioural issues together at least once a term for an in-depth, practical discussion of common issues and concerns.

5.7
The districts we visited as part of the audit varied in how regularly they carried out the RIBS process. One district required at least one RIBS review to be completed for any case where a child had been receiving behaviour support for more than three school terms. The Southern region has adapted the RIBS process to measure the quality of services provided across all of its services and staff; this is known as the Review of Support Services.

Receiving feedback on the support provided to students

For all four initiatives, the Ministry obtains feedback from parents/caregivers and educators about the support it provides to students. Recent client surveys have shown a high degree of satisfaction among parents/caregivers and educators with the Ministry’s support for students.

All four initiatives

5.8
The Ministry carries out client satisfaction surveys each year, and in some districts more frequently. The surveys ask parents/caregivers and educators how the Ministry delivered the service, including the accessibility of staff, how staff communicated and shared knowledge with them, the guidance provided, how well the Ministry kept them informed throughout the process, and whether the support made a positive difference.

5.9
The districts we visited carried out their client satisfaction surveys at different times of the year from each other, used different survey formats, collected different data, and collated and presented that data differently. While the surveys provided useful feedback for districts, it would be more helpful to have standardised formats and procedures, to allow easier collation of data at a national level.

5.10
The Ministry reported in 2007 that 10% of all parents/caregivers were given the opportunity to complete a survey, of whom 82.3% responded. About 80% of the respondents thought that the Ministry’s services were timely and made a positive difference to their child, and that the Ministry kept them informed in an appropriate manner. Educators we talked to as part of the audit commented that the quality of the Ministry’s service delivery could vary, depending on the Ministry staff involved; some staff were considered excellent.

5.11
The Ministry has told us that a national survey was also completed in 2008/09, and that for future surveys it plans to use the State Services Commission’s Common Measurement Tool, which is designed to allow comparison of results between government agencies and also with overseas agencies.

5.12
Some district offices we visited have established stakeholder reference groups, involving representatives from the district office, parents/caregivers, and educators, to obtain feedback on Ministry policies and practices. The feedback is used to inform practices and to seek options and solutions for particular problems. Some districts also meet from time to time with local cultural community groups, such as the Pasifika fono in the North West Auckland district and a Pasifika church group in Canterbury.

5.13
The Ministry gathers informal feedback regularly from its contact with parents/caregivers and educators, and from any complaints. Complaints are dealt with by the service or district managers. The IEP plan and the ICP process (discussed in paragraphs 5.14-5.18) also provide an opportunity for regular feedback from parents/caregivers about their child’s support.

Monitoring the progress of students

The Ministry regularly monitors the progress of students who are provided support through the four initiatives, through the Individual Education Programme plan or Individual Care Plan processes.

All four initiatives

5.14
Individual Education Programme (IEP) plans are intended to support students receiving ORRS, Severe Behaviour Initiative, and Speech Language Initiative support. Students receiving ORRS support are required to have a separate service agreement that links their identified needs with their IEP. Individual Care Plans (ICPs) are intended to support students receiving support through the School High Health Needs Fund.

5.15
IEP plans provide guidance for each student’s individual programme for a defined period, outlining the student’s skills and needs, and identifying achievement objectives and goals. The IEP plan is a tool for collaborative planning and assigning responsibilities between the Ministry, school, parents/caregivers, students (where appropriate), and other agencies where necessary. The IEP guidelines recommend that the plan be reviewed every term, or according to the needs of the student and any changes in circumstances.

5.16
Schools are responsible for reviews of a student’s IEP plan. The reviews are carried out by the student’s wider support team, and assess progress toward the outcomes identified in the initial assessment. Reviews are done in a variety of ways in different districts, including measuring progress against the key competencies in the New Zealand Curriculum and using the Ministry’s priorities of presence, participation, and learning as a basic structure.

5.17
The ICP specifies the care and supervision tasks the teacher aide will carry out, the monitoring system to ensure that the care remains appropriate to the student’s needs during the year, and the evaluation process to determine the student’s ongoing level of need for care and supervision.

5.18
Like IEP plans, ICPs are a tool for collaborative planning and assigning responsibilities between the Ministry, school, parents/caregivers, students (where appropriate), and other agencies and specialists where necessary. ICPs are reviewed each year, or according to the needs of the student and any changes in circumstances, and are the responsibility of the school. The reviews are carried out by the student’s wider support team, and assess progress toward the outcomes identified in the ICP.

Using monitoring information to inform planning

The Ministry uses information gained from its monitoring and from feedback to review, evaluate, and inform its support for individual students. However, the Ministry does not have the systems in place to review or evaluate the effectiveness of its support for students at an aggregated level.

All four initiatives

5.19
The Ministry has noted that the highly individualised nature of each student’s needs and circumstances, and consequently their IEP plan or ICP, makes it difficult to aggregate and analyse results from the IEP/ICP process about the effectiveness of the support provided.

5.20
The Ministry is trialling a goal attainment scaling tool (an evaluation technique that involves preparing an outcome scale to measure individual or group progress towards achieving identified goals). The Ministry intends to use this tool to aggregate the evaluative outcomes data from the IEP/ICP process and to help assess the effectiveness of its services.
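The Ministry’s material does not describe the scale or formula its tool uses. Purely for illustration, the sketch below assumes the conventional five-point attainment scale (-2 to +2) and the standard Kiresuk-Sherman T-score formula often used in goal attainment scaling; the function name and example values are ours, not the Ministry’s.

```python
import math

def gas_t_score(attainments, weights=None, rho=0.3):
    """Aggregate per-goal attainment levels (-2..+2) into a single T-score.

    Standard Kiresuk-Sherman formula: a score of 50 indicates goals were
    met overall, above 50 better than expected, below 50 less than expected.
    rho is the assumed inter-goal correlation (conventionally 0.3).
    """
    if weights is None:
        weights = [1.0] * len(attainments)
    weighted_sum = sum(w * x for w, x in zip(weights, attainments))
    sum_w = sum(weights)
    sum_w_sq = sum(w * w for w in weights)
    denominator = math.sqrt((1 - rho) * sum_w_sq + rho * sum_w ** 2)
    return 50 + 10 * weighted_sum / denominator

# Example: a student with three IEP goals -- one met (0), one exceeded (+1),
# and one not quite met (-1) -- scores exactly 50, "about as expected overall".
print(round(gas_t_score([0, 1, -1]), 1))
```

Scores calculated in this way for many students could, in principle, then be averaged or tracked over time at a district or national level, which is the kind of aggregation the Ministry is seeking.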

5.21
The Ministry has carried out other reviews of particular aspects of its support for students. Nationally, the Ministry is reviewing the Specialist Service Standards. The 2008 resourcing survey provided feedback from schools about the effectiveness of services, noting a rating of “high” for more than half of the students in the survey. At a district level, the Ministry has carried out reviews of communication support workers, client files, specialist services, and processes for allocating services.

5.22
The Ministry told us that it has several separate databases for client and funding data. This has made it difficult for the Ministry to accurately track the services provided to a particular student, because the databases cannot share and aggregate outcome data. Inconsistent data entry and duplicate entries for particular students in one or more of the databases have also made it difficult to identify students and track services. The Ministry has been working to improve regional practices in entering and maintaining accurate data.
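Purely as an illustration of the data problem described above, and not a description of the Ministry’s actual databases, the sketch below shows how inconsistently entered identifiers and duplicate records prevent a straightforward join of client and funding data, and how normalising identifiers allows records to be aggregated per student. All identifiers, field names, and values are invented.

```python
# Hypothetical records from two separate databases: the same student appears
# under different identifier formats, and once as a duplicate.
client_db = [
    {"student_id": "ST-0042", "name": "J. Smith", "initiative": "ORRS"},
    {"student_id": "st0042", "name": "Jane Smith", "initiative": "Severe Behaviour"},
]
funding_db = [
    {"student_id": "ST0042", "teacher_aide_hours": 5.0},
]

def normalise(raw_id):
    """Remove case and punctuation differences so the same student keys identically."""
    return "".join(ch for ch in raw_id.upper() if ch.isalnum())

# Aggregate the services recorded for each normalised student identifier.
services = {}
for record in client_db:
    services.setdefault(normalise(record["student_id"]), set()).add(record["initiative"])

# Join funding data on the same normalised key.
hours = {}
for record in funding_db:
    key = normalise(record["student_id"])
    hours[key] = hours.get(key, 0.0) + record["teacher_aide_hours"]

# With a shared, normalised key, the three raw records describe one student.
for key in services:
    print(key, sorted(services[key]), hours.get(key, 0.0))
# -> ST0042 ['ORRS', 'Severe Behaviour'] 5.0
```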

5.23
After our audit, the Ministry told us that it was working to ensure easier monitoring of students and services, and had gained funding approval to put in place a “business data warehouse” system. The Ministry also told us that it was working on a “Student support interventions” project, with indicators that are intended to provide information about the effectiveness of its interventions.

5.24
The Ministry’s new client filing format, designed to provide greater national consistency in recording and collating student information, has been implemented over the last two years. However, at the time of the audit, the Ministry noted that most of the feedback it received about the support for individual students was provided on paper, and that it did not have systems that could readily collate this feedback for monitoring purposes. This meant that the Ministry was not able to use the feedback to review or evaluate the effectiveness of its support for students at an aggregated level. The Ministry is investigating a case management system that will allow it to aggregate data and gain a complete picture of the outcomes of its service delivery.

5.25
In our view, the Ministry needs to improve its systems to gather and aggregate information about students so it can review and evaluate the overall effectiveness of the support the Ministry provides. Without reviewing and evaluating the overall effectiveness of its support, the Ministry cannot easily plan programmes and procedures it knows to be successful.

Recommendation 10
We recommend that the Ministry of Education improve its systems to gather and aggregate information about the effectiveness of its support for students.