
Using Contribution analysis in evaluating development projects

By Emilomo Ogbe, Founder of AISE Consulting

AISE Consulting recently evaluated complex advocacy and health systems strengthening projects in Nigeria and Kenya. Given the complexity of the interventions, we decided to use a contribution analysis approach instead of a standard impact evaluation approach. Proposing this approach to a client is often fraught with trepidation, because some clients are less interested in how the intervention contributed to outcomes (what worked, for whom, and why or why not) than in proof of attribution. Clients want to know how their intervention directly impacted outcomes.

It is a real win when we find clients who understand the complexity of their contexts and agree to collaboratively develop an evaluation framework. Together, we meticulously explore their work processes, the actions they took, and their role in the final outcomes of their program. This makes room to acknowledge the multiple actors and influences that advanced the desired outcomes or detracted from them.


Contribution analysis is a methodology used to identify the contribution a development intervention has made to a change or set of changes. The aim is to produce a credible, evidence-based narrative of contribution that a reasonable person would be likely to agree with, rather than to produce conclusive proof. Contribution analysis can be used during a development intervention, at the end, or afterwards. From: Contribution-analysis.pdf (intrac.org)


The contribution analysis approach is widely used, especially for complex, multi-layered interventions where multiple factors play a role and it is difficult to attribute causality and impact to any stand-alone intervention or action. The text box above gives a clear-cut definition of contribution analysis, with links to the steps required for an adequate contribution analysis.

After deciding with a client to adopt the contribution analysis paradigm, there are key steps to follow in preparation for the difficult journey ahead:


1. The Inception phase is particularly important.

Envision this stage as a collaborative process rather than a platform to finalize your work plan and budget. Use it to identify the external stakeholders to engage, the scope and details of your evaluation, and the internal stakeholders within your client’s organization who are open to reflecting on the program’s strengths and weaknesses. This ensures effective facilitation of the evaluation process and engagement with relevant stakeholders. Evaluations can be a vulnerable time for many clients, as they are forced to pause and reflect on the challenging work they have been doing for years while questioning their processes and value-add. Recognize and acknowledge this struggle and try to be kind. In so doing, you can frame the evaluation as a learning continuum while creating an enabling, safe environment for both you and your clients to thrive.


2.  Always involve and work closely with your client.

This is an intelligent way to decide which outcomes and processes you will review, including any existing information gaps. Although it is an independent evaluation, no one knows the project and its processes like your client. Listen, learn, and work with the client, combining their experience with your technical expertise to develop sustainable solutions. Remember, an award-winning evaluation report is not useful if the client cannot relate to it or use its findings to improve their work processes. The ability to integrate recommendations into the next phase of their work is key for clients. For example, at AISE Consulting, we use the core principles of Utilization-Focused Evaluation developed by Michael Quinn Patton.


Utilization-Focused Evaluation (UFE), developed by Michael Quinn Patton, is an approach based on the principle that an evaluation should be judged on its usefulness to its intended users.  Therefore evaluations should be planned and conducted in ways that enhance the likely utilization of both the findings and of the process itself to inform decisions and improve performance. From: Utilization-Focused Evaluation | Better Evaluation


3. Interrogate the thought process behind the program or intervention.

If a program theory or Theory of Change exists, conduct a participatory workshop where you and your client jointly interrogate its causal pathways and assumptions. The goal should be to identify the specific interventions or processes implemented at each step of the causal pathway. Do more than collect data and discuss specific program outcomes. Encourage your clients to tell their stories; listen to and document their narratives. Clients will often use program-laden language to describe activities, omitting the nuances, informal encounters, and communications that played a huge role in shifting behaviors and attitudes. These may have been the very factors that created a more enabling environment for the program or intervention.


4. Acknowledge the temptation to revert.

Recognize that, despite choosing this path, you will be tempted to revert to impact evaluation terms and language, whether you or your client intend to or not. Old habits die hard. Be clear about what the contribution analysis can reveal and what it cannot. Work closely with your client at the inception phase to articulate these expectations and to identify any secondary data that might be needed to meet the client’s reporting obligations to their funders.


5. Validate the preliminary findings through collaboration.

In a participatory way, engage first with your clients and the leadership within their organization. Their buy-in and feedback are crucial to ensuring that the findings and recommendations can be integrated into the organization and the program. Once you have done this, engage key program and field staff in a second round of validation. In this way, you will have captured all the key issues, gaps, and concerns. If you can, share your findings with external partners or stakeholders, particularly if your client has identified their input as essential to the process.

6. Finally, remember that the evaluation process is meant to be a learning experience for your client. Make sure they are core to the process, at the center of your evaluation trajectory, and your primary audience!



AISE provides and contributes to ethical and sustainable solutions to development and health system reforms in low- and middle-income countries. We do this by providing our clients with opportunities to co-create interventions, drawing out evidence-based solutions through evaluations and implementing them in a sustainable manner.