Monitoring, learning and evaluation for a complex world
Complex problems need dynamic MEL approaches
Systemic approaches to complex change offer the opportunity to tackle our most persistent challenges at scale – like creating climate-resilient food systems, ending child marriage or generating the millions of jobs that Africa’s growing youth population needs.
However, how do we monitor and evaluate the impact of our change efforts? Without ways to do so, we cannot learn what is working (and what is not). And without a true understanding of impact, it is difficult to galvanise the resources of time, effort, enthusiasm and money that change at scale will require.
Traditional monitoring, evaluation and learning (MEL) practices in development have predominantly been built to serve funders keen to ‘prove’ the efficacy of their work and hold partners accountable. They have rarely been designed to serve those who live within the complex world to be changed.
Often, they have been designed for a mythical static and linear world where impact can be attributed to isolated specific activities. Such approaches pay scant regard to the activity (intentional or otherwise) of other actors, and don’t adapt well to the characteristics of complex problems such as emergence, uncertainty and interdependency.
Systems MEL uses traditional tools, but in different ways and with different objectives
Learning about the ways systems are changing is not determined by the data collection tools we use, but by the things we pay attention to, the questions we ask (and how and when we ask them), and who we want the learning to serve.
There are several core principles that underpin systems MEL approaches:
- Serve the problem – MEL approaches designed primarily in service of funders and accountability can be extractive in nature, focused on holding partners accountable and producing learning in forms that are either private or difficult to access. In contrast, MEL oriented to systems change should be useful to a wide audience of people, both those who live with the issue and those who work on it. It should help people better understand what is going on, how change is happening, and where and why efforts are getting stuck.
- Integrate MEL in strategy and implementation – traditional MEL approaches are often structured around specific reporting milestones: evaluations tend to happen once programme activities are complete, and M&E managers may operate in isolation from strategy and implementation. One key objective of systems MEL is to produce data and learning that support real-time decision-making. Achieving this requires a close and ongoing relationship between strategy, learning and implementation roles and responsibilities.
- Pay attention to activity and context beyond a single intervention – traditional MEL often tries to isolate and measure the (expected) impact of a single project or intervention. But in a complex environment, straightforward cause-and-effect relationships are uncommon, and systems change is never the result of a single intervention by a single actor. Systems-based MEL therefore looks beyond the parameters of a specific intervention, seeking to capture what is emerging – unexpected as well as expected changes – in both the interventions themselves and the wider systems context.
- Reduce asymmetries of information and knowledge – systems that are working poorly for some and well for others typically have strong asymmetries of information, with the most marginalised often having the least access to knowledge about the issues that affect them most. A MEL approach that compares the efficacy of early flood warning systems across a number of communities may share that information with the providers or funders of those systems but not necessarily with the affected communities, who, if they had access to that insight, might be able to make their own adjustments to how they use and engage with the early warning systems available to them. Systems MEL holds an open mind about who is a producer and who is a consumer of learning, and therefore intentionally puts learning back into the system in ways that are accessible to all affected actors and that redress rather than exacerbate asymmetries.
Here at Wasafiri we work with a range of clients and partners on systems-based monitoring, evaluation and learning. For the World Economic Forum's Platform for Global Public Goods (PGPG), we created theories of change and metrics to track progress across initiatives ranging from improving ocean health to preventing violent extremism in East Africa.
We also worked with the Jobtech Alliance, which employed a systems change framework focusing on collaborative behaviours, practical interventions supporting digital platforms, and ecosystem-level influence to create quality jobs in Africa.
We are developing an open-access approach to MEL for systems change based on our own practice. Currently, we are seeking an initial round of feedback from partners and friends. If you are curious to learn more and would like to offer some friendly advice, please reach out to stella@wasafirihub.com. Once we have iterated further, we will share the approach publicly and run a number of live discussion sessions. So follow us on LinkedIn to sign up when we go live.
Here are some links to other good folks doing good work in this space:
- Guide to Complexity-Aware Monitoring Approaches. Momentum Knowledge Accelerator, USAID, 2020.
- Human Learning Systems: a practical guide for the curious. Centre for Public Impact, 2022.
- How to set up and manage an adaptive programme: lessons from the Action on Climate Today programme. Katherine Cooke, OPM, 2017.
- Making adaptive rigour work: principles and practices for strengthening monitoring, evaluation and learning for adaptive management. Ben Ramalingam, Leni Wild and Anne L. Buffardi, The Policy Practice, 2019.
- LearnAdapt: lessons from three years of adaptive management. ODI.
- Course: Introduction to Collaborating, Learning and Adapting in the Program Cycle. USAID Learning Lab.
- Collection of resources: UNDP's initiative on Systems Monitoring, Learning & Evaluation (SMLE).