

Monitoring, Evaluation and Learning

The distinct yet related activities of Monitoring, Evaluation & Learning (MEL) provide a cross-cutting service to all our other areas of expertise. MEL is integrated into every stage of our work, from design through implementation to lesson-learning at the end of a programme, to ensure we deliver lasting results and maximise Value for Money.

We design and implement flexible, participatory MEL approaches, and provide tools and technical solutions to monitor the present, evaluate the past and shape the future. With gender and inclusion as key components of our work, our approaches focus on continual improvement and learning within organisations, development sectors and programmes. We have an established track record in conducting complex evaluations and are at the forefront of thinking on MEL for adaptive programme management. Through our interventions, we contribute to improved performance and accountability, and demonstrate the value of client investment so that value for money can be optimised. The breadth of our in-house expertise, coupled with our practical experience in many of the key sectors, lies at the heart of our MEL reputation and service.

Promila Bishnoi

Aditya Khurana

Clarissa Poulson


  • DFID: Impact Assessment of TERI Programme for Clean Energy Access and Improved Policies for Sustainable Development, India (2015-16)

    IPE Global evaluated the impact of DFID’s funding to TERI for a programme on Clean Energy Access and Improved Policies for Sustainable Development in India and East Africa. The evaluation assessed the impact of DFID’s financial support for piloting improved technologies and new business models that promote the use of improved cook stoves and solar home lighting in India and Africa. In addition to evaluating the outcomes and impacts of the programme using a mixed-methods approach, the team assessed the utility of project data for programme management and identified lessons on what worked, what did not, and why.

  • DFID: Impact Evaluation of Westminster Foundation for Democracy (2013-2015)

    IPE Global evaluated the effectiveness of the Westminster Foundation for Democracy (WFD) in supporting democracy in more than 15 post-conflict and fragile countries through parliamentary strengthening programmes and political party assistance initiatives. The evaluation produced an assessment of the results WFD’s programmes yielded or contributed to in the selected countries, lessons learnt, and recommendations for WFD’s future implementation. It provided evidence for the UK government’s future decisions on democratic assistance in post-conflict countries, a clearer understanding of whether the UK is helping the countries concerned to move in the right democratic direction, and an assessment of the most effective approaches in different circumstances.

  • DFID: Development of a Learning Theory of Change for DFID: Pathways to a Learning Organisation (2016)

    IPE Triple Line worked with DFID’s Evaluation Department and Research and Evidence Division (RED) in June and July 2016 to develop a Theory of Change and an action plan for DFID to evolve into a learning organisation. The team worked with stakeholders across DFID to understand current learning mechanisms, the appetite and critical path for change, and the barriers to and enablers of change. The team also tested the assumptions in the Theory of Change, critically examining the outcomes and outputs to ensure they are coherent, consistent and logical within the flow of the pathways, and that realistic assumptions underpin the pathways between them and the impact statement. This assignment was part of DFID’s efforts to take forward the recommendations of two reports on how DFID learns: the Independent Commission for Aid Impact’s review ‘How DFID Learns’ and the Cabinet Office’s ‘What Works Review’ on the use of evidence in DFID.

  • DFID: Summative Evaluation of DFID Health Partnership Scheme (2016)

    IPE Global and IPE Triple Line undertook a summative evaluation of DFID’s Health Partnership Scheme in 2016. The Health Partnership Scheme (HPS) is a key DFID programme that aims to address critical health worker needs in developing countries. HPS supports partnerships between UK health institutions and those in low-income countries through health service skills transfer and capacity development. The summative evaluation used a multi-level framework to assess the impact and effectiveness of the programme and its progress towards desired outcomes. Our team conducted a systematic, in-depth analysis of the qualitative aspects, which facilitated wider lesson-learning about building health worker capacity in developing countries and the reciprocal benefits of partnerships for the UK. The evaluation used a participatory design process that built on and expanded existing review mechanisms and involved key stakeholders to support learning and buy-in. The team used a variety of complementary and innovative methods to generate additional data to supplement the existing monitoring data.

  • DFID: Evaluation Management Unit (EMU) for Forestry, Land-use and Governance (FLAG) programme, Indonesia (2015-18)

    IPE Global and IPE Triple Line are managing the EMU for DFID’s FLAG programme in Indonesia for a three-year period (2015-18). FLAG is a £32.5 million programme that aims to deliver effective and transparent land-use systems, government accountability at the provincial level, and transparency on land licensing decisions. FLAG supports the improvement of sustainable and responsible business, particularly in palm oil, and promotes alternative approaches to large-scale deforestation. Adaptive M&E, for adaptive programme management, is at the heart of our current work as lead of the Evaluation Management Unit (EMU), developing and implementing a framework for evaluating the programme at both project and programme levels. The evaluation framework ensures that data is gathered and analysed for each project and then synthesised for the evaluation of the overall programme. We are assessing the results achieved by FLAG and implementing an adaptive learning model that will support evidence-based decision-making on the scale-up or redesign of programme interventions.

To view the complete project list, click here.