True Impact Report Feedback Guidance
As you review grantee reports, consider the key questions outlined below. Feel free to use the proposed template responses, customizing them as needed.
| Report Section | Key Questions | Template Response Options |
| --- | --- | --- |
| Overview: Report Title | Does the report title accurately describe the program being reported on? Ensure the title is understandable by any stakeholder who may review the report. | |
| Overview: Report Description | Is the report description sufficient? If you were not previously aware of the program, would this description give you enough information about what the program does and what it is trying to achieve? | |
| Overview: Investment | Is the investment amount correct? Is the level (Incremental, Significant, Foundational) appropriate for the partnership? Note that the grantee determines this designation. If the grantee and donor understand this funding tier differently, it may be worth a discussion with the grantee. | |
| Overview: Report Dates + Stage | Does the report stage match the report timeline? Initial = program just beginning; Interim = program underway; Final = program has finished. You may also want to confirm that the stage and dates are consistent with the grant agreement. Here's a helpful article about reporting dates. | |
| Overview: Location | Are the locations listed correctly? These should represent where implementation occurs. | |
| Intervention | Do the core service and associated information accurately represent the program? Is additional narrative description included? | |
| Beneficiaries | Do the selected demographics include the populations indicated by the grant agreement? Where possible and relevant, we encourage reports to include gender, socio-economic status, ethnic background, and age, along with a brief written narrative about the beneficiaries. More guidance can be found here. | |
| Outcomes: Logic Model | Is an appropriate service/logic model selected? Pay attention to the full logic model presented, not just its name; the key is whether the appropriate Social Impact indicators are selected. If a full report has been drafted and the service model needs to change, please contact True Impact for support. This change can be complicated for a grantee to navigate and may be easier for True Impact to make. | |
| Outcomes: Indicator Definition | Has the grantee adequately defined the indicator under the "Success Criteria"? This section should give you a clear threshold for success so you understand the indicator and how the grantee measures success in their work. If needed, backup documentation can be attached to provide further context. | |
| Outcomes: Measurement Category | Are the selected measurement categories (Guess, Estimate, Directly Measure) appropriate? Has the grantee adequately described their measurement approach under "Sources/Assumptions"? Is backup documentation provided if needed? All indicators in Program Development should be Directly Measured. | |
| Outcomes: Indicator Alignment | Are the forecasted or final indicator values reflective of the outcomes to be achieved (or already achieved) during the reporting period designated on the report? Recall that the reporting period, indicator values, and budget should all align so that a cost per outcome can be calculated. | |
| Outcomes: Program Development | Ensure that the Program Development indicators represent internal capacity-building or organizational efforts related to supporting this program. These indicators should describe how the program itself is strengthened in order to provide services to beneficiaries. The Program Development section is optional. | |
| Outcomes: Reach | Are the forecasted or actual indicator values aligned with the measurement units? It can help to check the Success Criteria against the indicator definition to confirm this alignment. For example, most indicators in the Reach section should be measured in units of people (not # of businesses, dollars, etc.). There are some exceptions, such as trees planted or meals distributed. | |
| Outcomes: Learn + Act | The Learn and Act sections are optional. Does the report need these sections but omit them? Or, more commonly, does the report include them when they are unnecessary and duplicative of information in other sections? Not all programs (for example, some education models) need indicators capturing behavior change. | |
| Outcomes: Social Impact | Based on your understanding of the program from the program description and Intervention section, are any Social Impact indicators missing from the logic model? Does the logic model sufficiently capture the impacts of the program and illustrate how lives are improved? | |
| Outcomes: Custom Indicators | Ideally, encourage the organization to select an existing template indicator rather than defining a custom indicator. This allows for clarity of impact across a portfolio. | |
| Documentation | Backup documentation is not required, but if it is needed, ensure it is attached. There are several opportunities for documentation uploads: within specific indicators, at the end of the Outcomes section, and at the end of the Budget section. | |
| Budget | Does the budget represent the entire program during this reporting period? If the budget value is the same as the investment amount, this suggests that this investment is the program's only funding source. | |
| Narratives (final reports only) | For final reports, there is an option to include a brief lesson learned, a success story, and a photo. Where possible, encourage grantees to complete these sections. | |
| Custom Questions (standard reports do not include these) | If the report includes any custom questions designated by the donor, ensure that anything required has been completed. | |