Is ICAI getting there?
I know that ICAI is still quite controversial – see for example this Twitter conversation, led by Owen Barder. My view is that they are ‘getting there’: boosted by a triennial review which has sharpened their mandate; overseen in a better-structured way by a sub-Committee of the House of Commons International Development Select Committee; and gaining the confidence to make over-arching and analytical comments about DFID. A new Commission is currently being recruited, to take office next year, so we have to hope momentum will be maintained and further improvements made.
This is the third time I have reviewed ICAI’s work. See here and here for previous comments and suggestions. My reviews contribute to the work of the Select Committee, to which I am a Specialist Adviser.
This year’s annual report is well done, comprehensive in its coverage of ICAI’s work (including follow-up on previous years’ reports) and with a useful analytical overview of findings to date. Note that the Select Committee took evidence on the annual report on 8 July, from ICAI and from DFID.
The triennial review of ICAI confirmed its usefulness and made suggestions about the future balance of work, to which ICAI is responding, especially with more thematic reports and relatively fewer project reports.
The timing of ICAI’s Annual report is such that it pre-dates both DFID’s annual evaluation report (see here for the 2013 edition) and the annual Departmental Report (see here for 2013). It would be useful to consider all these together, in terms of coverage as well as lessons learned.
Questions remain about the relationship between ICAI, the NAO (briefly covered in the Report) and DFID’s own evaluation work (not covered at all).
- The NAO (see here) appears to have done relatively little on development issues in the past year, apart from its overall performance review and two (excellent) briefs for the IDC. It did, however, undertake a report on malaria, published in July 2013.
- The evaluation work of DFID is a different matter. According to the ICAI review of learning, 40 evaluations were completed in 2013-14 and 425 (sic) were either underway or planned in July 2013 (para 2.34). According to the ICAI report, DFID has committed over £200m to evaluation and has 55 specialist evaluation posts (para 2.30). All these numbers dwarf the ICAI budget, personnel and volume of reports. ICAI produced 12 reports in 2013-14. Its expenditure was £3.4m, of which £3.0m was programme spending and the balance secretariat costs (Section 11.2 of the ICAI report). It would be useful to tabulate DFID, ICAI and NAO reports, in order to compare scope and coverage: this cannot be done until the new evaluation report is produced.
The ICAI report devotes ten pages to reviewing follow-up from previous years’ reports. In general, it considers that ‘DFID has responded very well to a number of our recommendations’ (Pg 24). However, ‘we remain concerned about how long it took to address some issues. . . (and) we have noticed a lack of consistency in some of the Year Three management responses from DFID . . . We believe that a consistent approach is required: that if DFID accepts that improvements are required and indeed it states that it is already doing some of the necessary work, it should fully accept ICAI’s recommendations and give full management actions’. (Pgs 24-25).
The Report summarises the main results of the 12 reports published during the year. The list of reports and the overall scores are given below. Two reports were awarded the highest score (green), five green-amber, four amber-red and one red.
The report also lists a total of nine overall lessons from its work. Its overall conclusion is that ‘DFID at its best is capable of outstanding performance and is rightly recognised as a global leader on many aspects of development assistance’ (Pg 26). However,
- DFID’s global portfolio faces many new challenges in the coming years.
- DFID’s strategies and theories of change are usually strong at the individual programme level but weaker in respect of more complex interventions with multiple components.
- DFID’s corporate results agenda has brought greater rigour and discipline but can also distort programming choices.
- With a few exceptions, DFID effectively targets the most vulnerable with its assistance.
- DFID’s choice and oversight of its delivery partners has emerged as a key vulnerability in the effectiveness of UK aid.
- Sound programme governance and an active role for intended beneficiaries are key success factors.
- DFID is paying more attention to the fiduciary protection of UK funds but lacks clear strategies for ensuring overall value for money.
- DFID staff learn well as individuals but corporate learning is not pursued consistently.
- DFID’s programmes are not flexible enough to maximise learning within the lifecycle of each project.
There are some quite sharp challenges embedded in these comments. For example:
- ‘As DFID scales up its fragile states expenditure, it needs to adopt programming approaches that are simple and robust enough to succeed in a difficult and volatile context. . . . (In Afghanistan) we found that it was consistently the simpler interventions (such as building roads) that were more likely to reach successful completion and deliver meaningful results for the intended beneficiaries. By contrast, more complex programmes, such as those focused on job creation, were harder to adapt to the difficult context and were not based on strong evidence as to what works.’ (Pgs 26-7)
- ‘We have seen some examples of very good DFID programming in middle-income countries, particularly India. The impact of DFID’s support in this environment depends less on the volume of financial support and more on its ability to act as a purveyor of development excellence, helping its partner countries to identify innovative solutions to their economic and social challenges. . . . This kind of engagement, based on knowledge partnerships rather than large-scale funding, is likely to become more important in less aid-dependent contexts and as the focus of the international development agenda shifts from scaling up basic services towards tackling hard-to-reach groups and areas.’ (Pg 27)
- ‘DFID’s strategies and theories of change are usually strong at the individual programme level but weaker in respect of more complex interventions with multiple components.’ (Pgs 27-8)
- ‘We have been less convinced by DFID’s strategies for achieving more complex development goals – such as private sector development – through portfolios of linked interventions.’ (Pg 28)
- ‘We are concerned . . . that the emphasis on headline targets is distorting the way results are reported. . . We are also concerned at the proliferation of ‘reach indicators’, which record the numbers of people nominally benefiting from DFID programmes, whether or not they experience developmentally significant change. . . . We advise DFID to avoid this kind of results inflation and to focus on meaningful results measures, rather than superficially impressive ones. . . This is not just a question of over-optimistic reporting on results. It can also distort programming choices and create a disincentive to organisational learning. In our private sector development review, pressure for early results led to a focus on ‘quick wins’ and easily scalable outputs (such as training) over harder-to-measure and longer-term results that might be more transformative. We are also concerned that pressure for quick results may compromise sustainability. Some of the most important results in the programmes we have reviewed have taken time to emerge and become sustainable’. (Pgs 29-30)
- ‘One of our most persistent concerns across our reviews has been the lack of due attention by DFID to the management of delivery partners.’ (Pg 31)
- ‘We noted that the procurement process created incentives for firms to over-promise on results and underestimate their costs in order to win bids, causing problems with delivery. Furthermore, delays in procurement led to DFID placing pressure on contractors to meet unrealistic spending and results targets, forcing them to shortcut key processes like stakeholder engagement. Across our reviews, we noted a lack of mechanisms for learning within contractor teams to be shared across DFID.’ (Pg 32)
- ‘While we found some good emerging results from (the PPA) mechanism, we were concerned that DFID seemed unclear in what it hoped to achieve from it, which diminished its overall impact. We found evidence that the funding had led to a stronger focus on managing for results. Some of the NGOs, however, felt under pressure to use the funds for activities that yielded readily reportable results, which partly undermined the value of core funding.’ (Pg 32)
- ‘We found the tri-departmental Conflict Pool (the Foreign and Commonwealth Office, the Ministry of Defence and DFID) to be cumbersome in its management structures and processes. The heavy effort involved in cross-departmental working was detracting attention from strategy and results. . . The UK government has announced the creation of a new Conflict, Stability and Security Fund, under the authority of the National Security Council, as part of a drive to improve the coherence of the UK’s engagement overseas. We will be keeping a close eye on this new mechanism.’ (Pgs 33-4)
- ‘Our review, ‘How DFID learns’, found that DFID lacks processes to ensure that experience gained from its programmes is consistently captured and shared. DFID managers are not accountable for learning within their teams and heads of country offices do not always create a culture that is conducive to learning.’ (Pg 35)
- ‘We find a relative neglect of monitoring and learning within the lifespan of individual projects. Designs are often too rigid in conception, which can be reinforced by cumbersome approval processes and an overly mechanical approach to the use of management tools such as logframes.’ (Pg 36)
It is a pity that ICAI is not always clear about the standard against which it is comparing DFID activity. It does sometimes feel as though ICAI is setting an impossibly high standard, so that DFID is bound to fail. The learning report was a case in point. ‘Who does this better?’, one wants to know: in general, in Government, or in the development sector? More comparisons would definitely be useful.
One way to find better comparisons is to carry out joint evaluations, comparing the work of different agencies on the same topic or in the same place. The Development Assistance Committee of the OECD has experience in this field, through its work programme on evaluation. An example of joint evaluation is the joint evaluation on general budget support, completed in 2006. A more recent example is the joint evaluation of Irish and UK aid to Tanzania, completed in 2011.
It is also interesting to compare ICAI’s lessons with those in DFID’s own evaluation summary. As noted, the 2014 report is not yet available. However, the 2013 report contains some useful insights, organised by sector of intervention and identified as ‘initial indications’. Some points are:
- ‘The importance of not going to scale too early, and focusing support on a limited number of districts and sectors; the importance of strong political economy analysis, reflected in clear theories of change, as essential to any public sector reform; also the value of working on both bottom up and top down change processes, and the need to recognise longer-term timeframes and to focus on changes that are sustainable.’
- ‘The value of well researched knowledge and policy products for building capacity of regional economic communities and member states in specialised skills such as international trade negotiations, dispute settlement procedures, and international contractual arrangements; and the role of flagship reports in stimulating debate on policy options at the highest levels.’
- ‘PFM reforms tend to be more effective when donors’ efforts are directed under an overarching reform strategy owned by the Government; also opportunities to link PFM reform to other relevant reforms such as decentralisation.’
- ‘Effective mobilisation of youth is key to ensuring the sustainability of peace building activities’
- ‘There is very little evidence of wider social, behavioural and economic impact of (community development) interventions; and no evidence of the effects of gender parity at community level.’
- ‘A common finding from these (productive sector) evaluations for policy-making is the increased effectiveness of interventions that combine practical support in the form of assets or cash with capacity development in the form of training, education or counselling.’
- ‘Experience across these productive sector programmes provides support for two key principles for improving the effectiveness of programme implementation:
- programme effectiveness is enhanced by aligning programmes with national government strategies and plans.
- effectiveness is enhanced by selecting implementing partners with strong local capacities and the ability to tailor programmes to the local environment.’
- ‘(In Burma) no donor could have achieved the same results as the pooled fund had done.’
- ‘Depending on the country context and service delivery mechanism, community response can be effective at increasing knowledge of HIV, promoting social empowerment, increasing access to and use of HIV services, and even decreasing HIV incidence, all through the effective mobilisation of limited resources.’
Two interesting questions from this list (presumably to be updated in 2014) are (a) whether they are qualitatively different from those obtained by ICAI – which bears on the question of whether or not ICAI offers real value-added; and (b) whether or not ICAI has taken this work into account in formulating its own recommendations.
An important question is whether the 2014-15 work programme is adequate to follow up on strategic issues raised, as well as fulfil ICAI’s other mandates. Nine reports are planned for 2014-15, giving a good range of work. They are listed below. The multilateral report will be a useful adjunct to the planned Multilateral Aid Review.
- DFID’s approach to anti-corruption
- The International Climate Fund
- The scale-up of aid spending
- Security and justice programmes
- DFID’s funding of multilaterals
- DFID’s use of the private sector (follow-up)
- Ministry of Defence Overseas Development Assistance Spending
Another issue is whether ICAI should be less coy about ‘policy’, as in the following statement at the beginning of the Annual Report:
‘The International Development Committee, chaired by Sir Malcolm Bruce MP, is the House of Commons select committee that provides oversight of the Government’s aid policy – a clear difference between their and our responsibilities, as policy is outside of our remit.’
In fact, ICAI reports (a) examine DFID’s policy work (for example, in country assessments), and (b) have implications for policy. Would it not be more honest, and interesting, to make policy issues central to the analysis of aid impact? This could be done without trespassing on the policy remit of the IDC. Indeed, it might well enhance complementarity.
Finally, none of the above should be taken to suggest that there is no more work to do. The Select Committee report on ICAI, forthcoming, will doubtless make many useful suggestions.