
India Malnutrition Program, Monitoring & Evaluation
August 5, 2011
Caitlin McQuilling
“Don’t automate a broken system”
Monitoring and evaluation is often the most difficult part of any development program. It tends to be an afterthought for implementers, who are too busy rolling out the immediate and the tangible to plan how they will evaluate their work at some later stage.
It’s important that data is not something that’s just written down in a grid each month and never seen again. The strongest programs are the ones in which ground-level staff find their reporting useful in their daily work; making data helpful to ground-level staff in turn makes their reporting more accurate.
Even though RMF put a focus on our M&E from the beginning of our program, it has continued to be one of the biggest struggles in implementing the program on the ground. As our program grows, we not only constantly assess, analyze, and evaluate our data but also try to give the same level of analysis to the processes by which we collect it.
In this age where there seems to be a tech solution for everything, many development programs make the mistake of thinking that technology will be a “silver bullet” that will fix all of their challenges in the field. The best advice we received from one of the technology experts we consulted when deciding which direction to take our program was “don’t automate a broken system”: before introducing any new technology, an organization should make sure its fundamentals are solid. As we moved forward with planning two innovative technology pilot programs for data collection, integrating technology such as mobile phones or digital slates, we also needed to ensure that the fundamentals of our program were strong and that we understood, and were honest about, our strengths and weaknesses in data collection.
The following is a description of our current M&E system and the steps we went through to refine our processes and fix the problems we encountered.
M&E Process
Quantitative
- Daily Diaries: A simple book in which the Community Nutrition Educators (CNEs) freely record their daily activities and notes from the field.
- MUAC Diary: CNEs each record in this daily register the names of the severely and moderately acutely malnourished (SAM/MAM) children they see and each child’s Mid-Upper Arm Circumference (MUAC).
- Triplicate form: A triplicate carbon-paper form that the CNEs use to refer children to the Nutrition Rehabilitation Centre (NRC) and to track the referral through the system. One copy stays with the CNE, one goes to the family, and one is deposited at the NRC; we collect these copies at the end of the month.
- Weekly Reporting format: Using the daily diaries, triplicate forms, and MUAC registers, the CNEs fill in the weekly reporting format and submit it to their supervisors.
- Monthly Reporting format: The supervisors collect all of the CNEs’ weekly reporting formats and consolidate these into the monthly format.
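As we weighed the technology pilots, it helped to see how this paper chain would map onto a digital one. Here is a minimal sketch in Python of the roll-up from register rows to weekly and monthly counts; the data structures and field names are hypothetical illustrations, not our actual formats or our pilot code.

```python
from dataclasses import dataclass
from datetime import date

# One row of a CNE's MUAC register: a single child measured on one visit.
# Field names are illustrative, not RMF's actual formats.
@dataclass
class MuacEntry:
    village: str
    child_name: str
    visit_date: date
    muac_cm: float

def weekly_counts(entries, week_start, week_end):
    """Roll register rows up into the counts a weekly reporting
    format asks for (children screened, per village)."""
    counts = {}
    for e in entries:
        if week_start <= e.visit_date <= week_end:
            counts[e.village] = counts.get(e.village, 0) + 1
    return counts

def monthly_counts(weekly_reports):
    """Consolidate several weekly count dictionaries into one
    monthly aggregate, as the supervisors do by hand."""
    monthly = {}
    for week in weekly_reports:
        for village, n in week.items():
            monthly[village] = monthly.get(village, 0) + n
    return monthly

register = [
    MuacEntry("Village A", "Child 1", date(2010, 6, 7), 11.2),
    MuacEntry("Village A", "Child 2", date(2010, 6, 7), 13.0),
    MuacEntry("Village B", "Child 3", date(2010, 6, 9), 11.8),
]
week = weekly_counts(register, date(2010, 6, 7), date(2010, 6, 13))
print(monthly_counts([week]))  # {'Village A': 2, 'Village B': 1}
```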
Qualitative
- CNE feedback form: Filled out once a month by the CNEs to provide RMF management information about case studies, challenges, and success stories in the field.
- CNE Needs form: Filled out monthly by the CNEs, who put a simple tally in the relevant box each time they have to give counseling on a particular topic. Intended to be a very easy way to assess counseling needs in the field.
Process of verifying our data
We held our own internal audit of the first year of our reporting (May 2010 – March 2011). Since our program is reporting big numbers and getting a lot of attention from government and NGOs, we wanted to be sure internally that our data was airtight and accurate.
Thanks to having a multi-layered management information system (MIS), we were able to go to the source of our reporting to get accurate data. By back-tracking the data, we were not only able to verify our numbers down to the individual child but also to identify at which steps our MIS wasn’t working well.
Each CNE maintains a daily diary where she notes down information on the village she visited each day. She then copies all the information on children under 5 into a MUAC register, where she records each child’s MUAC on every visit to the village over the months. This gives us a full-year record (or a record from whenever the child was first identified) for each individual child. The CNE uses this register daily so that she can recall the history of each child she visits, see whether the child is improving or worsening from one visit to the next, and direct her counseling accordingly. We believe we have accurate MUAC registers for all CNEs, except for a few whom we let go for poor performance; for those villages we had the newly hired CNEs conduct fresh surveys and compared that fresh data with the questionable data.
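To illustrate what the register captures, here is a sketch of a digitized version of one child’s record, with the trend read-out a CNE does by eye. The MUAC cut-offs below are the widely used WHO values for children 6–59 months (SAM below 11.5 cm, MAM from 11.5 up to 12.5 cm); they are an assumption in this sketch, since this post does not spell out the program’s cut-offs.

```python
from datetime import date

# Commonly cited WHO MUAC cut-offs for children 6-59 months.
# An assumption here: the program's actual cut-offs may differ.
SAM_CUTOFF_CM = 11.5
MAM_CUTOFF_CM = 12.5

def classify(muac_cm):
    if muac_cm < SAM_CUTOFF_CM:
        return "SAM"
    if muac_cm < MAM_CUTOFF_CM:
        return "MAM"
    return "normal"

def trend(history):
    """Given a child's MUAC history as (visit_date, muac_cm) pairs,
    report whether the child improved or worsened since the last
    visit, which is what the CNE reads off her register before
    counseling."""
    ordered = sorted(history)
    if len(ordered) < 2:
        return "first visit"
    previous, latest = ordered[-2][1], ordered[-1][1]
    if latest > previous:
        return "improving"
    if latest < previous:
        return "worsening"
    return "unchanged"

# Example: one child's record across three monthly visits.
history = [(date(2010, 6, 3), 11.2), (date(2010, 7, 1), 11.4),
           (date(2010, 8, 5), 11.8)]
print(classify(history[-1][1]), trend(history))  # MAM improving
```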
The CNEs use this register to fill out their weekly reporting format, which is submitted to their supervisor each week. This format is where the problems with calculation and addition started.
The Coordinators then collect all the CNEs’ weekly formats and use those to create the district monthly aggregate report. This is also where some errors occurred.
The weekly and monthly reporting formats were filled out in hard copy by CNEs and District Coordinators (DCs), who did the math by hand or on their cell-phone calculators. This frequently led to human error that was not picked up until later, when the data was entered into Excel sheets by our data entry operator. Consolidating all this data each month was also a cumbersome process for the Coordinators and often took longer than RMF management would have liked.
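Even a very simple script catches this class of error at the source. The sketch below, with hypothetical numbers, recomputes a reported total from the underlying register rows and flags any mismatch; this is the check our data entry step was effectively performing weeks too late.

```python
def check_totals(register_rows, reported_total):
    """Recompute a weekly total from the underlying register rows and
    flag any mismatch with the hand-calculated figure."""
    recomputed = sum(register_rows)
    if recomputed != reported_total:
        return f"MISMATCH: register says {recomputed}, format says {reported_total}"
    return "OK"

# Hypothetical example: a CNE saw 4, 6, and 5 children on three village
# visits but wrote 14 on her weekly format instead of 15.
print(check_totals([4, 6, 5], 14))
# MISMATCH: register says 15, format says 14
```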
Dr. Athar Qureshi, RMF’s Director of Programs, worked with the Coordinators to create a new format, organized by village, in which we reworked the totals for each village by month. This gave us more accurate numbers. The Coordinators spent a weekend filling in all the data and checking the math.
Once the data was in an Excel spreadsheet, the team analyzed it and compared it to the originally submitted data, the baseline data, and the NRC survey. The NRC survey and baseline data are results we are sure about because we can link them to individual children, so they are good reference points for verification. We found that most of our data had been reported accurately, with minor errors here and there, but that the process by which we collected our data was extremely time-consuming, and going back to check it was even more so. This also made it difficult for the District Coordinators to apply the program data in the field and to cross-check the reports the CNEs submitted.
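Because each referral leaves a triplicate copy at the NRC, a reported referral count can also be checked mechanically against the forms actually deposited. The sketch below shows the idea; the structures and village names are hypothetical.

```python
def verify_referrals(monthly_report, nrc_forms):
    """Compare each village's reported referral count against the
    triplicate forms deposited at the NRC that month. monthly_report
    maps village -> reported referrals; nrc_forms is a list of
    (village, child_name) pairs taken from the collected forms."""
    actual = {}
    for village, _child in nrc_forms:
        actual[village] = actual.get(village, 0) + 1
    discrepancies = {}
    for village, reported in monthly_report.items():
        deposited = actual.get(village, 0)
        if reported != deposited:
            discrepancies[village] = (reported, deposited)
    return discrepancies

report = {"Village A": 3, "Village B": 2}
forms = [("Village A", "child 1"), ("Village A", "child 2"),
         ("Village B", "child 3"), ("Village B", "child 4")]
print(verify_referrals(report, forms))  # {'Village A': (3, 2)}
```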
During this review we also realized that there were many activities the CNEs were conducting on a daily basis that were not reflected in our reporting formats; these activities were captured only as notes in the daily diaries, and each CNE recorded them in her own way. During the data cross-checking, these daily diaries proved very useful for checking numbers, names, and dates whenever a question arose in the reporting formats. RMF realized the value of giving the daily diaries a uniform structure while still leaving the CNEs some free space for personal notes.
We also faced some difficulties with the formatting of data from month to month, as the overall compilation was done by different people at different times. Before January 2011, RMF did not have a data entry operator; instead, all program data was entered by DCs or program managers. Depending on individuals’ schedules, one person would enter the data for a few months and then another would take over.
Lessons Learned
While this review of program data was tedious and painful at times, it was a tremendous learning experience for all staff members involved. For the CNEs it was a process of reflection on the quantum of work they had done and also a time to formally point out questions they had and challenges they faced. By spending so much time sorting through the program numbers, DCs gained an increased familiarity with what the numbers were actually capturing and became much more comfortable with data. RMF’s program management also got to see where the gaps in reporting were and where we could support program staff better.
The following were some of the key lessons we learned and changes we’ve made to our reporting system:
- The Daily Diaries and MUAC diaries are key, but structure is important
- Someone needs to “own” the data