As you review grantee reports, consider the key questions outlined below. Feel free to use the proposed template responses and customize them as needed.
| Report Section | Key Questions | Template Response Options |
| --- | --- | --- |
| Description | Is the report description sufficient? If you were not previously aware of the program, would this description provide you with enough information about what the program does and what it is trying to achieve? | |
| Dates | Do the report dates match the content in the report? | |
| Location | Are the locations listed correctly? These should represent where implementation occurs. | |
| Activities | | |
| Impact Summary: Impact Model | Is an appropriate Cause Category and impact model selected? | |
| Impact Summary: Impact Model | Is the end beneficiary appropriately identified? | |
| Impact Summary: Program Development | Are the Program Development indicators representative of internal capacity building or organizational efforts that support this program? These indicators should describe how the program itself is strengthened in order to provide services to beneficiaries. The Program Development section is optional. | |
| Impact Summary: Learn & Act | The Learn and Act sections are optional. Does the report need these sections but omit them? Or, more commonly, does the report include them when they are unnecessary and duplicative of information in other sections? Not all programs need to include indicators capturing behavior change (for example, some education models). | |
| Impact Summary: Social Impact | Based on your understanding of the program from the program description and intervention section, are any Social Impact indicators missing from the logic model? Does the logic model sufficiently capture the impacts of the program and illustrate how lives are improved? | |
| Impact Summary: Indicator Definition | Is the indicator adequately defined under “Success Criteria”? This section should provide a clear threshold for success so you can understand the indicator and how success is being measured. If needed, backup documentation can be attached to provide further context. | |
| Impact Details: Forecast | Are the forecasted or final indicator values reflective of the outcomes to be achieved (or already achieved) during the reporting period designated on the report? Recall that the reporting period, indicator values, and budget should all align so that a cost per outcome can be calculated (see the example below this table). | |
| Impact Details: Measurement Category | Are the selected measurement categories (Direct Measurement, Estimated Based on Data, Guess) appropriate? Is the measurement approach adequately described under “Sources/Assumptions”? Is backup documentation provided if needed? All indicators in Program Development should be Directly Measured. | |
| Beneficiaries | Do the selected demographics include the populations indicated by the grant agreement? Where possible and relevant, we encourage reports to include gender, socio-economic status, ethnic background, and age, along with a brief written narrative about the beneficiaries. | |
| Budget | Is the budget representative of the entire program during this reporting period? If the budget value is the same as the investment amount, this suggests that this investment is the program’s only source of funding. | |
| Narratives (final reports only) | Final reports include the option to add a brief lesson learned, a success story, and a photo. Where possible, encourage grantees to complete these sections. | |
| Custom Questions (standard reports don’t include these) | If the report includes any custom questions designated by the donor, ensure that anything required has been completed. | |
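The Impact Details: Forecast row notes that the reporting period, indicator values, and budget should align so that a cost per outcome can be calculated. Below is a minimal sketch of that calculation; the $50,000 budget and 1,000 outcomes are hypothetical figures chosen for illustration, not drawn from any actual report.

```latex
% Cost per outcome for a single reporting period (all figures hypothetical)
\[
\text{cost per outcome}
  = \frac{\text{program budget for the reporting period}}
         {\text{indicator value achieved in the same period}}
\]
% Example: a $50,000 budget delivering 1,000 outcomes
\[
\frac{\$50{,}000}{1{,}000\ \text{outcomes}} = \$50\ \text{per outcome}
\]
```

If the budget covers a different span than the indicator values, the resulting cost per outcome is not meaningful, which is why the alignment check matters.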