Six questions for the new ICAI
The Annual Report of ICAI was published at the end of June, and a new team of Commissioners took over at the beginning of July. In previous years, I have reviewed ICAI’s work and commented on its future plans (see here, here, and here). This year, it seems more appropriate to focus on the questions that should be asked at the beginning of a new mandate, by ICAI itself, and by its political masters, the International Development Select Committee.
There are six such questions. They are:
- Does ICAI accept the recommendation of the Cabinet Office Review of 2013, that it should focus less on routine scrutiny of spending and more on exploring high-level development issues?
- If so, what are those issues?
- How exactly does ICAI differentiate itself from the NAO and DFID’s own evaluation programme? Are the boundaries right?
- What exactly is ICAI’s ‘method’?
- How should ICAI approach lesson-learning and synthesis?
- How can ICAI best support the IDC?
1. Does ICAI accept the recommendation of the Cabinet Office Review of 2013, that it should focus less on routine scrutiny of spending and more on exploring high-level development issues?
The Independent Commission for Aid Impact was set up to assess the impact of aid: that is what it says on the tin, and in the guiding documents. ICAI itself says on its website that ‘the Independent Commission for Aid Impact scrutinises UK aid spending’. Most of its 46 reports have been true to that mission. For example, when it looked at trade development in Southern Africa, it concentrated on aid-for-trade projects rather than the impact of trade policy. There was no discussion of EU trade policy or of Economic Partnership Agreements. Similarly, when it looked at climate change, its focus was on a spending instrument, the International Climate Fund, rather than, say, the impact of the UK’s negotiation strategy. The same was true of its climate review in Bangladesh, back in 2011.
But spending is only part of DFID’s job, and many would argue a declining part, despite the recent increase in the aid budget. The Select Committee in the last parliament looked at the Future of UK Development Cooperation, and concluded specifically that policy issues were likely to predominate in the future. It said that
‘Aid remains essential for the poorest countries, and for some purposes in middle-income countries (MICs). . . However, overall, a new approach is required which reflects the changing situation. First, as aid is no longer provided to some MICs, such as India, new forms of co-operation have to be developed which facilitate links with UK institutions in a wide range of areas, including health, education, culture, law and science. . . Second, policy coherence for development (PCD) is at the heart of a new approach. This means working across Government in the UK, and with global partners in the multilateral system, to maximise the impact on development of all the UK’s actions.’
Does ICAI exclude itself from this growing area of work, or does it want to ‘play’ in the new space? More importantly, does the Select Committee want it to do so? An easy question, really. Surely, the answer must be ‘yes’.
If ICAI were to expand its remit, the triennial Cabinet Office Review of ICAI, completed in 2013, offers room for manoeuvre. While saying that ‘ICAI’s role is to provide independent scrutiny of UK aid spending’, it goes on to say that the original purpose (‘to produce a wide range of independent, high quality and accessible reports setting out evidence of the impact and value for money of UK development efforts’) should be revised to read
‘Carry out a small number of in-depth thematic reviews addressing strategic development issues faced by the UK Government’s development programmes, combined with additional short reviews (where needed) to address specific issues of interest/concern to key stakeholders’.
‘Thematic’, ‘strategic’, ‘issues’ and ‘programmes’ are all enticing words, consistent with a broader mandate. ICAI and the IDC could seize on those in defining the work programme. In case of doubt, a triennial review will be due in 2016, which will give the opportunity to confirm a new direction. By then, there might be some reports to demonstrate the value of the shift.
2. If so, what are those issues?
If a new approach is taken, what should be covered? A detailed answer is beyond the scope of this short piece, and better begins with a reading of DFID’s results framework. However, it is worth remembering the recommendation of the IDC that DFID’s results framework should be revised to take greater account of policy issues. The IDC said that ‘there is a section in the latest Departmental Report which covers PCD, but the key themes are not visible in DFID’s overall objectives or in its results frameworks, which focus too much on spending. Furthermore, DFID only offers a partial lens on the UK Government’s policy and activities.’ A new Departmental Report is due shortly. In the meantime, the 2014 Departmental Report contains a discussion of results. This paper, from September 2014, gives a more detailed account of a results framework which links the MDGs to outcome indicators for the bilateral and multilateral spending programmes.
Neither of these historic DFID documents deals well with policy issues, and to the extent that they focus on high-level development indicators, they in any case use the MDGs rather than the new, broader SDG framework. Clearly, a new results framework will be needed after September, when the SDGs are adopted. The new framework will also need to encompass the range of non-aid issues. For the record, the list of issues presented to the IDC in written evidence is illustrated in Figure 1. DFID’s spending programmes for the private sector have been the subject of ICAI review, but few others have featured. It would be great for ICAI to pick up some. A new mandate, indeed.
Figure 1: Beyond Aid issues identified to the IDC
3. How exactly does ICAI differentiate itself from the NAO and DFID’s own evaluation programme? Are the boundaries right?
As it develops a work programme under new leadership, a key question for ICAI is how it differentiates itself from other evaluation and scrutiny bodies, especially the NAO and DFID’s own evaluation programme. This is a question I have asked every time I have written about ICAI, and one which remains unresolved.
The NAO does not seem to present too much of a problem. It only produces three or four reports a year on development-related topics, including informative and analytical background briefs which support IDC oversight of DFID. In 2014, it produced one additional report, on the Private Infrastructure Development Group. In 2015, it has produced one report on managing the ODA target, and is working on another, on DFID Responding to Crises.
DFID’s evaluation programme is a different story. The latest annual report on evaluation was published in February 2015, covering the year 2013-14. It reports that 31 evaluations were completed during the year, of which 27, costing £8.48m, were published. A further 216 (sic) are planned or ongoing. The list includes an evaluation of the International Climate Fund, reviewed by ICAI in 2014. This volume of work compares with 10 reports published by ICAI in 2014, at a cost of £3.7m, including core admin costs for ICAI, and a total of 46 reports published during its lifetime.
The evaluation department coordinates and supports a wide range of evaluations, embedded in DFID programmes. Interestingly, the Annual Report distinguishes different kinds of review, with different costs, as in the table below. It would be interesting to ask ICAI to classify its own work in a similar way, although DFID’s own classification has changed a little in the last year.
DFID Evaluation types and costs 2013-14
Unlike, say, the Independent Evaluation Group of the World Bank, neither the DFID evaluation department nor ICAI pretends to offer comprehensive coverage of all projects and programmes, a strategic choice for which they have both been criticised. ICAI, at least, was not set up to do this. Nevertheless, both offer a wide selection of assessments. The table below shows the list of 27 reports published by DFID in 2013-14. It is interesting to ask which of these could not have been published by ICAI – and which ICAI reports could not have been published by the Evaluation Department. There are some notable duplications, for example Trademark South Africa.
Evaluation Reports published by DFID 2013-14
- South Asia Food & Nutrition Security
- Mobile Cash Transfers in Northern Afghanistan
- ODI Budget Effectiveness Programme
- Africa Democracy Strengthening
- Shelter, Sanitation, Recovery & Resilience Project - aka FRESH (Bangladesh)
- Global Mine Action
- International Citizen’s Service (Communications)
- UN Joint Programme on Gender Equality & Women’s Empowerment in Ethiopia
- Ghana Electoral Support
- Hunger Safety Net Programme 2009-2012 (Kenya)
- Community Land Use Fund (Mozambique)
- Koshi Hills (Nepal)
- Palestine Country Programme Evaluation
- Responsible & Accountable Garment Sector - RAGS
- Statistics for Results Facility
- Accelerated Data Programme
- Results Based Aid in Rwandan Education
- Trademark South Africa
- Southern Africa Regional Social & Behaviour Change Communications Programme
- Multi Donor Trust Funds National (Sudan)
- Tanzania Poverty Reduction
- HIV/Aids Prevention (Vietnam)
- Support to the INGO Consortium in Yemen
- Integrated emergency response Yemen
- Zambia Social Protection Expansion
- Programme Protracted Relief (Zimbabwe)
- ColaLife Operational Trial Zambia
This does, of course, raise an interesting question, about whether ICAI considers itself to be carrying out ‘evaluation’ of some kind or another. Again, that is something critics have questioned. It raises a new issue about exactly what kind of work ICAI thinks it is doing and should be doing.
4. What exactly is ICAI’s ‘method’?
ICAI describes itself as a ‘scrutiny’ body, and has carried out different kinds of study, including of projects, country programmes, sectors and overall aid management issues, like DFID’s approach to learning or its work on results. Links are on this page of the ICAI website. I don’t know how those would map onto the classifications used by DFID’s evaluation department, or the even longer list of evaluation types that DFID has used to explain the scope of evaluation work (see here for information and discussion of this, from my review of ICAI work in 2013, but remember that new classifications will be reported in the next evaluation report):
- Mixed Methods
- Experimental/ Quasi Experimental
- Formulation of evaluation questions
- Thematic/ Sectoral
- Country Programme
- Looking at Theory of Change
- Participatory Evaluation
Classification apart, the underlying question is about the difference between ‘evaluation’ and ‘scrutiny’, and comes to a crunch point in the debate about whether ICAI should be carrying out independent field work and consulting beneficiaries on its own account. I have expressed doubts about that in the past, as have others. Of course, a project, working for example on schools or health centres in a country, should state expected results, set benchmarks and measure progress. Measurement would normally include beneficiary views. The data the project collects should be available to stakeholders, including evaluation and scrutiny bodies. It might be the case, unusually, that an evaluation would find it necessary to carry out additional field work, in order to validate findings or fill gaps. It would be very surprising to me if a scrutiny body found it necessary to do the same, except on an informal basis to educate its team or perhaps sample some of the data collected.
This is important, and contrary to current ICAI practice. ICAI has made a virtue of its consultation processes with beneficiaries, and in some cases has commissioned quite large-scale surveys. I think that is a mistake. The questions ICAI needs to answer are not, for example, ‘what do beneficiaries think?’, but rather ‘has the project asked what beneficiaries think?’ and ‘has evaluation of the project confirmed that adequate data on beneficiary perceptions has been collected, analysed and disseminated?’. If ICAI Commissioners want to talk to a few beneficiaries, that is fine, but they should not be allowed to persuade themselves that rigorous field work is part of their remit. Scrutiny, in the end, is about quality control rather than research. Actually, so, in most cases, is evaluation.
It is just possible that the focus might shift from quality control towards research if ICAI’s own focus shifts from project-level work to higher-level policy issues. Even here, however, the focus needs to be on assessing whether DFID (or other parts of HMG) is carrying out the right research and analysis itself, for example on trade policy or business regulation. In addition, it would be very interesting for ICAI to review DFID’s evaluation work.
5. How should ICAI approach lesson-learning and synthesis?
It follows from the above that I am less worried than some of ICAI’s critics about the fact that its reports are selective in coverage and more resemble thoughtful essays than rigorous research reports. That seems to me to be consistent with the scrutiny role. At the same time, however, the scrutiny function implies careful attention to lesson-learning and synthesis.
Sometimes, this can be achieved through ICAI’s main reports. There have been several thematic reports, on topics like effectiveness, impact, learning, or the impact of Smart Rules. Otherwise, synthesis can be achieved by means of analysis in the Annual Report. This took some time to take off, but was well done in the 2014 report, which listed nine overall lessons, many of which were quite challenging to DFID.
The 2015 report is less detailed in this respect, or at least the lessons are less explicitly framed. The Report contains a chapter entitled ‘Significant themes’. There are three sections to the chapter, viz: (a) Major Challenges Facing UK Aid; (b) Achieving Transformative Impact; and (c) What Should DFID Do Differently. Thus, no bullet-point list of lessons. The nuggets are there, however, and we learn that:
- DFID needs to improve both its approach to and capacity to work on issues of economic development and the private sector. Its strategy is not based on a clear view of comparative advantage and lacks focus.
- DFID has yet to grasp the full implications of the fact that it is, to a significant extent, a specialist in fragile states.
- DFID is weaker at responding to long-term or chronic emergencies than to sudden-onset emergencies.
- With regard to the multilateral system, DFID has focused on organisational effectiveness at the expense of strategic dialogue, and yielded thought leadership in favour of pushing a narrow results agenda.
- Short programme cycles of 3-5 years work against a long-term approach to impact, and staff favour innovation over continuity.
- Beneficiary involvement has only recently begun to improve.
- Flexibility of delivery needs continual attention.
- Too many DFID programmes live in sectoral silos.
- Too much effort goes into sometimes spurious calculation of value-for-money.
- The emphasis on results has resulted in cumbersome procedures and heavy transaction costs.
- Risk management is not yet up to scratch.
- There is often a disconnect between DFID country programmes and the activities of central programmes.
- DFID staff learn well as individuals, but much knowledge is lost across the organisation.
- Influencing is not captured in results frameworks.
- Better communication is needed with stakeholders.
It is important to note that these points are framed in a generally complimentary assessment of DFID capabilities and performance, and also that some will no doubt question the evidence. I myself have argued that it would be useful to have more comparisons in the analysis – for example, with respect to learning, DFID is among the leaders in Whitehall and certainly as good as any other donor. ‘Compared to what?’ is always a useful evaluation question. I am a DFID enthusiast. Still, some of the barbs resonate, with the general gossip about DFID, and with more judicious reviews, like the DAC peer review of the UK.
But, anyway, I did not want to be distracted by substance this year, but rather to make the point that ICAI could do more to make its findings visible and longer-lasting. I learn from the DFID Evaluation Annual Report that the Department produces Thematic Briefs, which summarise findings across evaluations, and make them more accessible to stakeholders. I can’t actually find these briefs on the internet, and this may be because they are not published, but the Annual Report lists them as follows:
- Security and justice
- Support to political processes
- Social accountability and social empowerment
- Violence against women and girls
- Financial inclusion
- Climate change adaptation
- Low carbon development
A Freedom of Information request, anyone?
Would it be a good idea for ICAI to institutionalise learning and make it more accessible? That would be consistent with its own advice to DFID. Can we look forward to more thematic reports, and also perhaps ICAI Thematic Briefs?
6. How can ICAI best support the IDC?
Finally, ICAI has an important role to play in supporting the International Development Select Committee, to which it reports. The IDC approves the ICAI work programme and receives its reports. It has regular sessions with Commissioners. There is a sub-Committee specifically responsible for ICAI, chaired in the last Parliament by Fabian Hamilton MP. Should anything change?
First, the creation of the sub-Committee was an important innovation. It is important that the sub-Committee, and the IDC more widely, oversees the management of ICAI as well as making use of its results. For example, it was important that a member of the IDC was involved in selecting the new Chair of ICAI, and that a pre-appointment hearing was held. This level of operational involvement should continue.
Second, the IDC should certainly encourage ICAI to broaden its mandate beyond spending, as discussed above. This may mean ICAI becoming more involved in policy issues, something it has been careful to say it avoids, even though it palpably does touch on policy – and something the IDC itself has sometimes been chary of it doing. I do not see a conflict of interest. In effect, ICAI proposes and the IDC disposes.
Third, if it is true that ICAI proposes and the IDC disposes, I wonder whether reports should be approved by the IDC and owned by it before they pass to DFID for comment, and whether DFID’s comments should come to the IDC rather than directly to ICAI. ICAI, after all, is an ‘Advisory’ NDPB. Perhaps this is what happens already, but in any case, it is important that the IDC make the best possible use of ICAI Reports.
Finally, the work programme of ICAI obviously needs to be dovetailed with the IDC’s own work programme. This means making sure that ICAI makes substantive contributions to IDC enquiries, including by submitting formal written evidence, and also, perhaps, timing its studies and reports in order to contribute to the IDC programme. For example, at the time of writing, the IDC has just announced a timely enquiry into the new Sustainable Development Goals. Given the lesson-learning reported above, and the emphasis on the MDGs and SDGs in ICAI’s Annual Report, it would be very valuable for ICAI to submit evidence to this new enquiry, and to be available for oral cross-examination. This would inevitably draw on past work rather than new reports, but would be a useful introduction for the new team of Commissioners.