Catch Up Monitoring Systems
Catch Up was piloted by the Zambian Ministry of General Education, with support from the J-PAL Africa policy team along with Pratham, Innovations for Poverty Action Zambia, UNICEF Zambia, VVOB – education for development, and ZESSTA.

Catch Up was successfully implemented and will be scaled up to approximately 1,800 schools across Zambia. The Catch Up monitoring system relied on government officials at the school, zone, and district level. Additional mentors from VVOB – education for development were provided to strengthen the mentoring and monitoring system. This case study was drawn from observations and process monitoring conducted by Innovations for Poverty Action (IPA) Zambia and J-PAL Africa during the Catch Up pilot. The monitoring system and tools implemented during the Catch Up pilot in Zambia serve as examples and should not be replicated directly in other contexts. The current tools are being re-evaluated and improved for scale-up. When setting up monitoring systems, governments and implementers should carefully consider their specific contexts and constraints.

A child uses chalk to write on the wall during a TaRL reading activity in Karnataka, India.

Team Structure

The table below explains the Karnataka government’s monitoring structure and the roles and responsibilities at each level.

Pratham also deployed a team to support the programme. At the state level, a senior Pratham staff member was responsible for overall coordination with DSERT, in consultation with the state team: 3-4 members responsible for training support, content creation, and monitoring, and 1 measurement and monitoring associate responsible for coordinating data collection and analysis. Every district had 4-5 district in-charges (reduced to 1-3 in 2017-18) who coordinated with government officials at the district level and below to support and plan activities in their district.


Two key pieces of information were collected:

1.)  Information from schools about children’s learning levels and attendance

      School teachers recorded children’s attendance and assessment data. CRPs aggregated data from the schools. Block-level data entry personnel collected these sheets from all CRPs in the block and entered the data onto a data entry portal created by Pratham. District officials tracked data entry status on the portal and followed up with blocks to ensure timely data entry. Pratham created dynamic data visualizations, which were available to everyone on a website to facilitate timely decision-making.

2.)  Information from mentors about TaRL classroom visits

CRPs recorded their classroom observations in a sheet. These sheets remained with them to facilitate discussions during the review meetings at the block level. Even at higher levels, the focus was on discussing the observations in review meetings; the observation data was not entered into the data entry portal.

The forms and processes are explained in detail below.

Information From Schools

The following processes ensured that the data was quickly collected, entered, analysed, and reported. People were thoroughly trained on these forms, systems, and processes.

a) Teachers used the Learning Progress Sheet to record child-wise assessment and attendance data. The sheet was designed so that teachers could fill it in easily. After completing the one-to-one assessments, teachers aggregated the data, which gave them a clear picture of the learning levels of their class.

b) CRPs visited all schools within 3-4 days after the assessments to consolidate the data in the Cluster Consolidation Sheet. Note that the data was recorded grade-wise for every school. This sheet provided a summary of all schools a CRP was responsible for.

To increase engagement with data, the CRPs were also provided with a sheet to visualize the assessment data and to prioritize and plan their support to schools.

c) The Cluster Consolidation Sheets were submitted at the government block offices for data entry. Pratham designed an online data entry portal with separate login accounts for every block. The portal was simple to use and had strict data validations to ensure accurate data entry. Everyone involved in the programme could track the status of data entry in real time. District officials followed up with blocks where data entry was lagging and ensured completion of data entry in the stipulated time.

d) An online dynamic dashboard, which was available to all on a website, was created to showcase data in an easy-to-understand manner. The dynamic dashboards allowed people to see the data that was relevant to them and compare their location’s data with other locations.
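As an illustration of steps (a) to (c) above, the sketch below shows how grade-wise records might be validated and rolled up at the block level. This is a minimal Python sketch under assumed field names (school_id, children_assessed, and one count per reading level) and assumed validation rules; it is not Pratham’s actual portal logic.

```python
# Hypothetical sketch of portal-style validation and block-level aggregation.
# Field names, reading levels, and rules are illustrative, not Pratham's schema.
from collections import defaultdict

READING_LEVELS = ["beginner", "letter", "word", "paragraph", "story"]

def validate_record(record):
    """Reject a grade-wise school record whose level counts are inconsistent."""
    counts = [record[level] for level in READING_LEVELS]
    if any(c < 0 for c in counts):
        return False, "level counts must be non-negative"
    if sum(counts) != record["children_assessed"]:
        return False, "level counts must sum to the number of children assessed"
    return True, "ok"

def aggregate_block(records):
    """Sum validated grade-wise school records into block totals per level."""
    totals = defaultdict(int)
    for record in records:
        ok, reason = validate_record(record)
        if not ok:
            raise ValueError(f"{record['school_id']}: {reason}")
        for level in READING_LEVELS:
            totals[level] += record[level]
    return dict(totals)

records = [
    {"school_id": "S01", "grade": 4, "children_assessed": 30,
     "beginner": 5, "letter": 8, "word": 9, "paragraph": 5, "story": 3},
    {"school_id": "S02", "grade": 4, "children_assessed": 25,
     "beginner": 2, "letter": 6, "word": 8, "paragraph": 6, "story": 3},
]
print(aggregate_block(records))  # block totals per reading level
```

The design choice mirrored here is validating at the point of entry, so that errors are caught while the paper sheets are still at hand and can be checked.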

Information From Mentors

The main cadre responsible for regular school visits and mentoring of teachers was the cluster-level cadre of CRPs. Every CRP had 10-15 schools under their purview. It was critical that CRPs visit schools regularly to ensure that the programme was being implemented as planned, provide feedback to teachers, review the programme periodically, and take corrective measures. To prepare CRPs for this crucial role, it was important that they understand the TaRL approach well, so after their training all CRPs conducted practice classes for 15-20 days. Having used the materials and methods themselves, and having seen first-hand how children’s learning levels improved, the CRPs were much better able to train and guide the teachers in their charge.

The following measures were taken to set up a robust monitoring and review structure: 

  • Members of the block monitoring team were each linked with specific CRPs, giving them accountability for those CRPs and their clusters. Every district nodal officer was also assigned a specific block.
  • Each CRP was asked to make at least 5 visits to every school under his or her charge, providing support and guidance to teachers at each stage. Of the five visits, the first was made in the first two weeks of the intervention, the second between the baseline and the midline, the third immediately after the midline, the fourth between the midline and the endline, and the last a few days before the endline. This was a guideline; CRPs were free to make additional visits to schools that were struggling to make progress. In fact, based on the assessment data, each CRP identified 5 “least performing schools” to receive more support.
  • Once in a class, every CRP first observed the classroom activities, interacted with children, and then demonstrated effective learning activities. They used a School Observation Sheet to note important observations. The sheet also acted as a checklist, reminding CRPs of the things they should be observing:
    • Attendance: overall and of weakest
    • Assessment data use and understanding
    • Grouping: appropriate and dynamic
    • Materials: available and being used
    • Activities: appropriate and participatory
    • Progress: in reading and math
    • Challenges: faced by teacher
  • Review meetings were scheduled twice a month at the block level and once a month at the district level. Nodal officers or BRLs led the block-level meetings, and DIET Principals led the district-level meetings. The focus of these meetings was on discussing the observations made by mentors and the activities they undertook in class to support and improve the situation. Pratham team members at the district level attended these meetings to facilitate the discussions, and the group worked together to plan guidelines for subsequent visits.
  • Review at the state level happened after every assessment cycle to compare progress across locations and discuss field-level challenges and strategies to overcome them. Participants in these meetings were government and Pratham state-level personnel and DIET Principals.

Build Mentors’ Capacity To Support Teachers

The district teams and CRPs conducted practice classes before implementation, which helped solve major instructional and logistical challenges prior to the roll-out of the programme. These personal observations and reflections led to a stronger belief in the activities and in the possibility of improving learning levels within a 60-day programme. Even after implementation had started, Pratham staff regularly visited schools with the mentors and participated in review meetings to strengthen their mentoring.

Buy-In Of The Mentors To Collect Data

Data collection is often seen as an additional burden, especially by people involved in teaching and learning. This challenge was largely overcome by keeping the data collection processes simple and by helping people realise the importance of data. First, the indicators collected were cut down to only the most useful ones. The forms, portals, and dashboards were designed to be simple to fill in and use. Most importantly, insights from the data were made available to the implementers on time.

Build People’s Capacity To Collect, Understand And Use Data

During training at various levels, separate sessions were held to explain all forms and processes related to data collection. During review meetings, time was set aside to discuss the results from the dashboards. Pratham members supported government officials in building their capacity to understand and use data.

Technology Should Suit The Ground Realities

Data entry was delayed in a few locations because computers were inaccessible or networks unreliable. It therefore became clear that data-entry portals, dashboards, and reports also needed to be designed in a mobile-friendly manner to increase their reach. Nor was a single data collection strategy necessary for all locations: multiple strategies could be devised depending on field conditions in various areas.

Keep The Focus On Action

Mentors were given clear action steps based on data. For example, CRPs were asked to identify five “least performing” schools in their clusters after every round of assessments. The purpose was not to report this information to senior officials, but to enable the CRPs to probe the reasons why certain schools were not making enough progress. This action-oriented outlook towards data was influential in driving change in children’s learning outcomes.

A child stands at the board, writing the word "cesu" in chalk on a mind map during a TaRL reading class in Zambia. On the board, the word "cikolo" is circled, with lines to "bayi" and "cesu".
Monitoring Process

In order to properly mentor and monitor teachers, each group of mentors and monitors was equipped with TaRL expertise and data collection tools. Mentors then regularly observed Catch Up classes and gave feedback through one-on-one sessions with teachers. They wrote regular reports and attended review meetings to share TaRL challenges and successes.

Mentor Training

Trainings focused on creating a solid understanding of TaRL methodology through interactive sessions and classroom practice. High-quality training in TaRL methodology, along with classroom practice, helps to ensure that mentors are able to identify when TaRL practices are properly applied, to recognise problems, and to accurately evaluate teacher performance. Mentors were introduced to the monitoring tools and trained in giving feedback, collecting and aggregating data, and writing reports. At trainings, mentors were given a clear idea of the information they were responsible for collecting, how frequently they should collect this data, and the specific tools they should use. Setting clear expectations early on helped mentors to adequately monitor TaRL implementation.

Classroom Observation And Data Collection

Mentors completed classroom observation forms for each class visited. Where possible, they addressed mistakes or challenges they saw in the field, actively coaching teachers and demonstrating Catch Up methods in the classroom when appropriate. ZICs, supported by VVOB mentors, compiled all observation forms at the monthly zonal review meeting to aggregate the data and create zonal reports.

Monthly Zonal Review Meetings

Regular mentor meetings ensured that there was adequate space and time for discussing programme challenges. Once a month, all mentors within a zone attended a zonal review meeting, during which senior teachers shared aggregated school data and mentors discussed their observations. Mentors brainstormed solutions and actively worked to improve Catch Up implementation in the zone. When problems could not be addressed at the zone level, they were taken to the district level and addressed by the District Education Office. ZICs and VVOB jointly aggregated the data and wrote zonal performance reports.

District-Level Data Collection

ZICs submitted electronic copies of their zone summary sheets to the District Education Office (DEO) every month. The DRCC and corresponding VVOB coordinator jointly wrote a monthly report about district-level Catch Up performance. The DEO was responsible for reviewing the information and deciding if any action should be taken at the district level.


It is often possible for officials to focus more attention on a small number of schools during a pilot programme, although it may not be possible for them to continue to provide this much attention per school as a programme grows to scale. Teams should consider this when designing both the pilot and the scale-up. Piloting in a relatively small number of schools may allow local actors to work out the kinks and make better plans for the future, but teams should be realistic about how well pilot systems will work when programmes go to scale.

Monitoring Tools

Choose appropriate data collection tools for the context.

Since many schools in the Catch Up pilot districts had unreliable cell phone signal and no access to the Internet, paper tools were used. The data was digitised when aggregated at the zone or district level, but this process proved challenging.

Assessment Data

Collect and share assessment data as quickly as possible in order to allow officials to provide support to schools.

We recommend having data collected and shared within the first 10 days of the programme.

Classroom Observation

Consider whether data collected is useful for decision-making to improve teaching or the overall TaRL model.

An early version of the classroom observation tool required yes/no answers and, when used, showed little variation in answers, with teachers scoring well in almost all areas. This made it difficult for ZICs and VVOB coordinators to know which schools to target with additional support. In response, the tool was adjusted to include more multiple-choice questions, resulting in a wider range of responses. When working with government monitoring systems, aim for the bare minimum of necessary questions; early versions of the Catch Up tool included several questions that were burdensome for the government system.
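To illustrate why graded responses help with targeting, here is a minimal Python sketch that scores multiple-choice observation items and ranks schools for support. The item names and the 0-2 scale are assumptions for illustration, not the actual Catch Up observation tool.

```python
# Illustrative only: item names and the 0-2 scale are assumed,
# not the actual Catch Up classroom observation tool.
ITEMS = ["grouping_by_level", "materials_in_use", "activities_participatory"]

def school_score(visits):
    """Mean item score across all observation visits
    (0 = not seen, 1 = partly in place, 2 = well in place)."""
    total = sum(visit[item] for visit in visits for item in ITEMS)
    return total / (len(visits) * len(ITEMS))

observations = {
    "School A": [{"grouping_by_level": 2, "materials_in_use": 2,
                  "activities_participatory": 1}],
    "School B": [{"grouping_by_level": 1, "materials_in_use": 0,
                  "activities_participatory": 1}],
}

# Rank schools from lowest to highest score to target additional support.
for name in sorted(observations, key=lambda s: school_score(observations[s])):
    print(f"{name}: {school_score(observations[name]):.2f}")
```

With yes/no items, most schools cluster at the top of a score like this; a graded scale spreads schools out enough to show which ones need visits first.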

Mentoring Responsibilities

Set mentoring and monitoring expectations early, particularly if mentor/monitors play a dual role.

Sometimes, mentor/monitors succeeded in their monitoring role (i.e. observing the teacher, filling in forms, etc.) but failed at their mentoring role (i.e. helping groups of children, demonstrating activities, and actively supporting the teacher in the classroom). Supportive mentorship, which teachers find helpful and empowering rather than punitive, is a key principle of the TaRL approach, so this was a missed opportunity to use the mentors’ TaRL expertise to advise and support teachers in the classroom. It was addressed in part by advising mentors to complete the forms after, rather than during, their classroom observations.

Classroom Visits

Carefully consider possible scheduling conflicts and have a backup plan in place.

In one district, Catch Up took place during the school holidays, while some senior teachers were attending professional development courses. To counteract this, ZICs and DRCCs were asked to make additional visits, which they did successfully. In some districts, sports and co-curricular activities prevented children from taking part in Catch Up activities on some days, therefore reducing the number of classroom observations that mentors could make.

Driving Action

Guide mentors in using data to improve implementation.

In some cases, it proved difficult to use data to inform specific actions, indicating that mentors need more guidance on using data effectively. In creating monitoring and aggregation tools, implementers should consider how the information will be used at every level and create tools and report templates that point towards action steps.

Mentor Support

Support mentors to improve the monitoring process.

Process monitoring of the Catch Up pilot found a lack of monitoring and reporting on the performance of mentors themselves. Having a monitoring and feedback system in place for mentors could help them to more effectively mentor and monitor teachers. For example, more senior mentors could observe classes with senior teachers and provide feedback after the visit.

Data Aggregation

Create user-friendly data aggregation tools and processes.

Paper monitoring tools were correctly used but were difficult and time-consuming to convert to an aggregated electronic form for proper analysis and review at more central levels of the government. This was mitigated in part with the help of VVOB coordinators, who were responsible for supporting the aggregation process. In the scale-up, this challenge is being mitigated by creating a simpler, paper-based aggregation process, which can then be entered using a simple spreadsheet. A broader lesson learned is to simplify the data collection and aggregation process as much as possible to ensure that accurate, useful data is collected efficiently.
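As a sketch of what the simpler spreadsheet-based process could look like, the snippet below rolls keyed-in zone totals up to the district level with pandas. The file and column names are hypothetical, not the actual Catch Up templates.

```python
# Hypothetical roll-up of keyed-in zone totals to district level.
# File and column names are illustrative, not the actual Catch Up templates.
import pandas as pd

# One row per zone per month, keyed in from the paper summary sheets.
zones = pd.read_csv("zone_summaries.csv")

district = (
    zones.groupby(["district", "month"], as_index=False)[
        ["children_assessed", "beginner_level", "story_level"]
    ].sum()
)
district.to_csv("district_summary.csv", index=False)
```

Keeping the electronic step this small means the hard part, producing accurate totals, still happens on paper, close to the classrooms.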

Reporting

Set clear expectations for creating reports.

Although district- and zone-level officials were responsible for completing reports, in some cases the VVOB coordinator completed the report instead. In order to ensure that DRCCs and ZICs complete reports, the expectation could be made clearer at trainings and re-emphasised during implementation, to ensure that mentors see reporting as a core responsibility.
