Closing date: 20 June 2021
Duty Station: Juba, South Sudan
Purpose of consultancy:
The Peacebuilding Fund (PBF) project, “Protecting women and girls in South Sudan: Addressing gender-based violence (GBV) as a catalyst for peace”, has been implemented by the United Nations Development Programme (UNDP), the United Nations Population Fund (UNFPA), the United Nations Children’s Fund (UNICEF) and the United Nations Entity for Gender Equality and the Empowerment of Women (UN Women) since September 2018. The project is part of the United Nations Joint Programme (JP) on preventing and addressing gender-based violence (2017-2020), supported by the Peacebuilding Fund, and runs from September 2018 to August 2021. The programme aims to address the alarming GBV situation in South Sudan through an integrated approach that increases the empowerment of women by strengthening GBV prevention mechanisms and transforming harmful social norms into positive behaviour that promotes gender equality in Bor, Aweil and Akobo. This entails: (i) provision of life-saving integrated services through One Stop Centres/Family Protection Centres in Aweil and Akobo; (ii) strengthening women’s groups’ participation in local peace processes in Bor, Aweil and Akobo and increasing civic engagement in GBV prevention and response; (iii) increasing GBV survivors’ access to justice mechanisms; and (iv) transforming communities’ harmful social norms that contribute to GBV into positive norms that promote gender equality.
The PBF project document stipulates that an end-of-project evaluation be conducted towards the end of the project. The evaluation will be carried out in line with the United Nations Evaluation Group (UNEG) norms and standards and international good practice for evaluation. These norms and standards offer step-by-step guidance for preparing methodologically robust evaluations and set out the roles and responsibilities of key evaluation stakeholders at all stages of the evaluation process.
The main audience and primary users of the evaluation are: (i) the participating UN agencies for the project; (ii) the Government of South Sudan; (iii) the United Nations Country Team (UNCT) in South Sudan; and (iv) the donors operating in South Sudan. The evaluation results will also be of interest to a wider group of stakeholders, including: (i) implementing partners of the joint project; (ii) UN participating agencies’ regional offices and headquarters divisions, branches and offices; (iii) academia; (iv) local civil society organizations and international NGOs; and (v) beneficiaries of UN support (in particular women, adolescents and youth). The evaluation results will be disseminated to these audiences as appropriate, using traditional and new channels of communication and technology.
Purpose
The end-of-project evaluation will serve the following three main purposes: (i) demonstrate accountability to stakeholders on performance in achieving development results and on invested resources; (ii) support evidence-based decision making on the progress made against the results outlined; and (iii) contribute key lessons learned to the existing knowledge base on how to carry forward the gains made through the intervention.
Objectives
The objectives of this evaluation are:
• To provide the participating agencies and national stakeholders with an independent assessment of the performance of the project and key lessons learned, based on the relevance, effectiveness, efficiency and sustainability of the project against the expected outputs and outcomes set forth in the results framework.
• To draw key lessons from past and current cooperation and provide a set of clear, forward-looking options leading to strategic and actionable recommendations for the next project.
Geographic Scope
The evaluation will cover interventions at the national level and in Bor, Akobo and Aweil.
Thematic Scope
The evaluation will cover the thematic areas outlined under the four outcomes envisaged in the results framework. In addition, under the evaluation criteria noted above, the evaluation will cover cross-cutting issues such as gender equality, human rights and peacebuilding, monitoring and evaluation (M&E), communications, innovation, resource mobilization and strategic partnerships.
Scope of work
During the term of this consultancy, the International Consultant and the National Consultant are expected to work as a team, with the International Consultant serving as Team Lead, to:
• Develop the methodology and tools for conducting the evaluation in collaboration with the Evaluation Reference Group (ERG), especially the Evaluation Manager, participating agency focal points, the Resident Coordinator’s Office and the M&E focal points of each participating agency
• Prepare an inception report detailing the methodology, including how the functionality of the One Stop Centres (OSC) will be assessed
• Conduct and facilitate the evaluation across the 3 field locations of project implementation
• Liaise with respective line departments, development partners, UN agencies and civil society organizations to interview stakeholders and receive their inputs and recommendations
• Prepare the draft evaluation report based on the field missions and interviews with the stakeholders
• Prepare and present a final report of evaluation findings and recommendations
• Prepare a presentation of the evaluation findings against the effectiveness, efficiency, relevance, coherence and sustainability criteria
• Undertake any other work that may be agreed with the ERG.
2. Methodology and Approach
2.1 Evaluation Approach
Theory of change-based approach
The evaluation will adopt a theory-based approach that relies on an explicit theory of change, which depicts how the interventions supported by the project are expected to contribute to a series of results (outputs and outcomes) that lead to the overall goal of the project.
The theory of change will play a central role throughout the evaluation process, from the design and data collection to the analysis and identification of findings, as well as the articulation of conclusions and recommendations. The evaluation team will be required to verify the theory of change and use it to determine whether changes at output and outcome levels occurred (or not) and whether the assumptions about change hold true. The analysis of the theory of change will serve as the basis for the evaluators to assess how relevant, effective, efficient and sustainable the support provided by the project has been.
Participatory approach
The evaluation will be based on an inclusive, transparent and participatory approach, involving a broad range of partners and stakeholders at national and sub-national levels. These stakeholders include: representatives from government, civil society organizations, implementing partners, the private sector, academia, other United Nations organizations, donors and beneficiaries (in particular women and girls, adolescents/youth and men). They can provide insights and information, as well as referrals to data sources that the evaluators should use to assess the contribution of the project.
The Evaluation Manager in the UNFPA South Sudan CO will establish an ERG comprised of key stakeholders including: governmental and non-governmental counterparts at national level, Implementing Partners, staff from the Technical Working Group of UN agencies. The ERG will provide inputs at different stages in the evaluation process.
Mixed-method approach
The evaluation will primarily use qualitative methods for data collection, including document review, interviews, group discussions and observations through field visits, as appropriate. The qualitative data will be complemented with quantitative data to minimize bias.
2.2. Methodology
The evaluation team shall develop the evaluation methodology in line with the evaluation approach and with the guidance provided in the UNEG Norms and Standards for Evaluation, Ethical Guidelines for Evaluation, Code of Conduct for Evaluation in the UN System, and Guidance on Integrating Human Rights and Gender Equality in Evaluations. When contracted, the evaluators will be requested to sign the UNEG Code of Conduct prior to starting their work.
The methodology that the evaluation team will develop builds the foundation for providing valid and evidence-based answers to the evaluation questions and for offering a robust and credible assessment. The methodological design of the evaluation shall include in particular: (i) a theory of change; (ii) a strategy for collecting and analyzing data; (iii) specifically designed tools for data collection and analysis; (iv) an evaluation matrix; and (v) a detailed work plan.
Finalization of the evaluation questions and assumptions
Based on the preliminary evaluation questions presented in the present terms of reference (see section 5.2), the evaluators are required to finalize the set of questions that will guide the evaluation. The final set of evaluation questions will need to clearly reflect the evaluation criteria and key areas of inquiry (highlighted in the preliminary evaluation questions). The evaluation questions should also draw from the theory of change. The final evaluation questions will structure the evaluation matrix and shall be presented in the design report.
Sampling strategy
The participating agencies of the project (UNDP, UNFPA, UN Women, UNICEF) will provide an initial overview of the interventions, the locations where these interventions have taken place, and the stakeholders involved in these interventions.
Based on information gathered through desk review and discussions with the participating agencies, the evaluators will refine the initial stakeholders map and develop a comprehensive stakeholders map. From this stakeholders map, the evaluation team will select a sample of stakeholders at national and sub-national levels who will be consulted through interviews and/or group discussions during the data collection phase. These stakeholders must be selected through clearly defined criteria and the sampling approach outlined in the design report. In the design report, the evaluators should also make explicit what groups of stakeholders were not included and why. The evaluators should aim to select a sample of stakeholders that is as representative as possible.
The evaluation team, comprising two members (one international Team Lead and one national team member), shall also select a sample of sites to be visited for data collection and provide the rationale for the selection in the design report.
Data collection
The evaluation will consider primary and secondary sources of information. Primary data will be collected through semi-structured interviews with key informants at national and sub-national levels (government officials, representatives of implementing partners, civil society organizations, other United Nations organizations, donors, and other stakeholders), as well as group discussions with service providers and beneficiaries (women and adolescents and youth) and direct observation during visits to programme sites.
The evaluation team is expected to dedicate approximately twelve (12) days to data collection in the field. The data collection tools that the evaluation team will develop, which may include protocols for semi-structured interviews and group discussions, a checklist for direct observation at sites visited or a protocol for document review, shall be presented in the design report.
Validation mechanisms
All findings of the evaluation need to be firmly grounded in evidence. The evaluation team will use a variety of mechanisms to ensure the validity of collected data: systematic triangulation of data sources and data collection methods; regular exchanges with the Technical Working Group of the project; internal evaluation team meetings to share and discuss hypotheses, preliminary findings and conclusions; and a debriefing meeting with the UNCT and the ERG at the end of the field phase, at which the evaluation team will present the preliminary findings and emerging conclusions.
Additional validation mechanisms may be established, as appropriate.
3. Evaluation Process
The evaluation can be broken down into five different phases that include different stages and lead to different deliverables: preparatory phase; design phase; field phase; reporting phase; and facilitation of use and dissemination phase.
3.1. Preparatory Phase
The preparatory phase includes:
● Establishment of the ERG.
● Drafting the terms of reference (ToR) and approval of the draft ToR by the participating UN agencies.
● Selection of two consultants (one international, one national).
● Compilation of background information and documents for desk review by the evaluation team.
● Preparation of a first stakeholders map.
3.2. Design Phase
The evaluation team will conduct the design phase in consultation with the Evaluation Manager and the ERG. This phase includes:
● Desk review of initial background information as well as other relevant documentation.
● Review and refinement of the theory of change.
● Formulation of a final set of evaluation questions based on the preliminary evaluation questions provided in the ToR.
● Development of a comprehensive stakeholders map and sampling strategy to select sites to be visited and stakeholders to be consulted in South Sudan through interviews and group discussions.
● Development of a data collection and analysis strategy, as well as a concrete work plan for the field and reporting phases.
● Development of data collection methods and tools, assessment of limitations to data collection and development of mitigation measures.
● Development of the evaluation matrix (evaluation criteria, evaluation questions, assumptions, indicators, data collection methods and sources of information).
At the end of the design phase, the evaluation team will develop a design report that includes the results of the above-listed steps and tasks.
3.3. Field Phase
The evaluation team will undertake a field mission to project sites to collect the data required to answer the evaluation questions. Towards the end of the field phase, the evaluation team will also conduct a preliminary analysis of the data to identify emerging findings and conclusions to be validated with the ERG. The field phase should allow the evaluators sufficient time to collect valid and reliable data. While a period of 12 days is recommended, the Evaluation Manager will determine the optimal duration of the field mission in consultation with the evaluation team during the design phase. The field phase includes:
● Meeting with the participating agencies to launch the data collection.
● Meeting of evaluation team members with relevant programme officers
● Data collection at national and sub-national levels.
At the end of the field phase, the evaluation team will hold a debriefing meeting with the ERG to present the preliminary findings and emerging conclusions from the data collection. The meeting will serve as an important validation mechanism and will enable the evaluation team to develop credible and relevant findings, conclusions and recommendations.
3.4. Reporting Phase
In the reporting phase, the evaluation team will continue the analytical work (initiated during the field phase) and prepare a draft evaluation report, taking into account the comments and feedback provided by the ERG at the debriefing meeting at the end of the field phase.
This draft evaluation report will be submitted to the ERG for quality assurance purposes. Prior to submission of the draft report, the evaluation team must ensure that it has undergone internal quality control against the criteria outlined in the Evaluation Quality Assessment (EQA). The M&E advisors/focal points of each participating agency will play a role in quality assurance.
The Evaluation Manager will collect and consolidate the written comments and feedback provided by the members of the ERG. On the basis of the comments, the evaluation team should make appropriate amendments, prepare the final evaluation report and submit it to the ERG. The final report should clearly account for the strength of evidence on which findings rest to support the reliability and validity of the evaluation. Conclusions and recommendations need to clearly build on the findings of the evaluation. Conclusions need to clearly reference the specific evaluation questions from which they have been derived, while recommendations need to reference the conclusions from which they stem.
The evaluation report is considered final once it is formally approved by all four participating agencies.
3.5. Facilitation of Use and Dissemination Phase
In the facilitation of use and dissemination phase, the evaluation team will develop a PowerPoint presentation and an evaluation brief that convey the findings, conclusions and recommendations of the evaluation in an easily understandable and user-friendly way.
Roles and responsibilities of Evaluation Team Lead
Evaluation team leader
The evaluation team leader will hold the overall responsibility for the design and implementation of the evaluation. She/he will be responsible for the production and timely submission of all expected deliverables in line with the ToR, and will lead and coordinate the work of the evaluation team, ensuring the quality of all deliverables at all stages of the evaluation process. The Evaluation Manager will provide methodological guidance to the evaluation team in developing the deliverables, in particular, but not limited to, the design report (evaluation approach, methodology, work plan and agenda for the field phase), the draft and final evaluation reports, and the PowerPoint presentation of the evaluation results. The team leader will lead the presentation of the design report and the debriefing meeting with the ERG at the end of the field phase, and will be responsible for liaising with the Evaluation Manager, the participating agencies and the RCO.
Quality Assurance
The ERG is responsible for quality assurance and quality assessment. While quality assurance occurs throughout the evaluation process and covers all deliverables, quality assessment takes place after the completion of the evaluation process and is limited to the final evaluation report only.
The evaluation team leader also plays an important role in quality assurance. The evaluation team leader must ensure that the evaluation team member provides high-quality contributions and that the deliverables submitted comply with the quality assessment criteria outlined below. The evaluation quality assessment checklist (see below) is used as an element of the proposed quality assurance system for the draft and final versions of the evaluation report.
1. Structure and Clarity of the Report
To ensure the report is user-friendly, comprehensive, logically structured and drafted in accordance with international standards.
2. Executive Summary
To provide an overview of the evaluation, written as a stand-alone section including key elements of the evaluation, such as objectives, methodology and conclusions and recommendations.
3. Design and Methodology
To provide a clear explanation of the methods and tools used, including the rationale for the methodological approach, and to ensure that constraints and limitations are made explicit (including limitations applying to interpretations and extrapolations, and the robustness of data sources).
4. Reliability of Data
To ensure that sources of both primary and secondary data are clearly stated, that the credibility of primary (e.g. interviews and group discussions) and secondary (e.g. reports) data is established, and that limitations are made explicit.
5. Findings and Analysis
To ensure sound analysis and credible evidence-based findings. To ensure interpretations are based on carefully described assumptions; contextual factors are identified; cause and effect links between an intervention and its end results (including unintended results) are explained.
6. Validity of Conclusions
To ensure conclusions are based on credible findings and convey evaluators’ unbiased judgment of the intervention. Ensure conclusions are prioritized and clustered and include: summary, origin (which evaluation question(s) the conclusion is based on), and detailed conclusions.
7. Usefulness and Clarity of Recommendations
To ensure recommendations flow logically from conclusions, are targeted, realistic and operationally feasible, and are presented in order of priority. Recommendations include: summary, priority level (very high/high/medium), target (administrative unit(s) to which the recommendation is addressed), origin (which conclusion(s) the recommendation is based on), and operational implications.
8. SWAP - Gender
To ensure the evaluation approach is aligned with SWAP (guidance on the SWAP Evaluation Performance Indicator and its application to evaluation can be found at http://www.unevaluation.org/document/detail/1452 - UNEG guidance on integrating gender and human rights more broadly can be found here: http://www.uneval.org/document/detail/980).
Duration and working schedule:
The consultant is expected to fulfil the above tasks over a period of two months (49 working days).
An agreement on the activities and timeline, including field visits, meetings and consultations, shall be reached between the Evaluation Reference Group and the Consultant.
Once the evaluation team leader has been recruited, she/he will develop a detailed work plan in close consultation with the Evaluation Manager and ERG.
• An inception report that includes the background, key tasks, the approach to completing them, key deliverables and a work plan with a time frame, not exceeding 10 pages.
• A draft report (not more than 50 pages) with references.
• A final report with an executive summary, conclusions and key recommendations.
Monitoring and progress control, including reporting requirements, periodicity format and deadline:
The International Consultant will serve as the team lead for the assessment and will ensure delivery of the agreed outputs.
The final deliverables will be approved by the ERG before final payment.
Expected Deliverables
The evaluation team is expected to produce the following deliverables:
● Design report. The design report should translate the requirements of the ToR into a practical and feasible evaluation approach, methodology and work plan. It should include (at a minimum): (i) a stakeholders map; (ii) an evaluation matrix (incl. the final set of evaluation questions, indicators, data sources and data collection methods); (iii) the evaluation approach and methodology, with a detailed description of the agenda/timeline for the field phase; (iv) and data collection tools and techniques (incl. interview and group discussion protocols).
● PowerPoint presentation of the design report. The presentation will be delivered at an ERG meeting to present the contents of the design report and the agenda for the field phase. Based on the comments and feedback of the ERG, the Evaluation Manager and the Regional M&E Adviser, the evaluation team will develop the final version of the design report.
● An evaluation brief outlining the methodology, findings and recommendations of the evaluation.
● PowerPoint presentation for debriefing meeting with the ERG, M&E Working Group of UNCT, PMT and UNCT. The presentation provides an overview of key preliminary findings and emerging conclusions of the evaluation. It will be delivered at the end of the field phase to present and discuss the preliminary evaluation results with UNCT and the members of the ERG.
● Draft and final evaluation reports. The final evaluation report (maximum 70 pages plus annexes) will include evidence-based findings and conclusions, as well as a full set of practical and actionable recommendations to inform the next project cycle. A draft report precedes the final evaluation report and provides the basis for review by the participating agencies, ERG members and the Evaluation Managers of the participating agencies. The final evaluation report will address the comments and feedback provided by the stakeholders.
● PowerPoint presentation of the evaluation results. The presentation will provide an overview of the findings, conclusions and recommendations to be used for dissemination purposes.
All deliverables will be developed in English.
The consultant will report to the UNFPA Gender Specialist.
Expected Travel: Within South Sudan; based in Juba, with field missions to Aweil, Bor and Akobo.
Required Expertise and Qualifications
Education:
● Master’s degree in Social Sciences, International Studies, Gender studies, Peace building and conflict resolution, Development Studies or a related field.
Knowledge and Experience:
● 7 years of experience in conducting or managing evaluations in the field of international development and peace building.
● Experience in leading evaluations commissioned by United Nations organizations and/or other international organizations and NGOs.
● Demonstrated expertise in one of the thematic areas of programming covered by the project, especially peacebuilding programming.
● In-depth knowledge of theory-based evaluation approaches and ability to apply both qualitative and quantitative data collection methods and to uphold standards for quality evaluation as defined by UNEG.
● Good knowledge of peacebuilding and GBV strategies, policies, frameworks and principles as well as the international humanitarian architecture and coordination mechanisms.
● Ability to ensure ethics and integrity of the evaluation process, including confidentiality and prevention of harm to evaluation subjects.
● Ability to consistently integrate human rights, peace building and gender perspectives in all phases of the evaluation process.
● Excellent management and leadership skills to coordinate and supervise the work of the evaluation team.
● Excellent analytical skills and demonstrated ability to formulate evidence-based conclusions and realistic and actionable recommendations.
● Excellent communication (written and spoken), facilitation and knowledge-sharing skills.
● Good knowledge of the national development context of South Sudan.
Languages:
● Fluent in written and spoken English; knowledge of Arabic will be an asset.
How to Apply
Please send your application and a short letter of motivation, with "PBF END OF PROJECT EVALUATION CONSULTANT" in the subject line, to Kenneth Muchiri at ssco.vacancies@unfpa.org by 20 June 2021.