Key takeaway
The pre-submission checklist is the last quality control before you click submit. CDP scoring partners read your response start to finish. Anything broken, inconsistent, or missing at this stage costs you points. This lesson is the operational checklist used by Leadership-tier teams: 30 specific items to verify before you commit to submission. Use it as your own pre-flight check.
Why the checklist matters
After you click submit, your response is locked. There is no edit window. The scoring partner reads what you submitted; clarification requests are limited.
Companies with strong scores treat the final 48 hours before submission as a structured review. Companies with weak scores treat it as a frantic clean-up. The same content can score very differently depending on which path is taken.
The 30-item pre-submission checklist
Module-by-module completeness
- Module 1 (Introduction): all questions answered, sector classification confirmed, financial filing references match audited financials, value chain description consistent with Module 7
- Module 2 (IRO): time horizons set, substantive thresholds quantitative, methodology described
- Module 3 (Risks and Opportunities): at least 3 risks and 3 opportunities, all with quantified financial impact, all linked to time horizons from Module 2
- Module 4 (Governance): Q4.1 to Q4.12.1 all completed, board oversight cadence specified, compensation alignment disclosed, trade association list with positions
- Module 5 (Strategy): scenario analysis described, transition plan documented, target details specified with SBTi status, internal carbon pricing if applicable
- Module 6 (Consolidation): approach stated and consistent with Module 1
- Module 7 (Climate): Scope 1, Scope 2 (location and market), Scope 3 (multiple categories) all completed; methodology disclosure detailed; verification statement attached
- Module 8 (Forests): commodity-by-commodity disclosure for material commodities; targets and engagement; cut-off date specified
- Module 9 (Water): withdrawals by basin, stress overlay, risks, targets
- Module 10 (Plastics): polymer-level reporting, recycled content, recyclability, EPR status (if applicable)
- Module 11 (Biodiversity): LEAP outputs, dependencies and impacts, geographic exposure
- Module 12 (Sector-specific, FS or other): all sector questions completed
- Module 13 (Sign-off): all sign-off questions completed; signatory listed; date confirmed
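Teams often track module completeness in a spreadsheet; the same idea can be sketched in a few lines of code. This is a minimal tracker under illustrative assumptions: the module names and item lists below mirror this lesson, not any official CDP export format.

```python
# Required items per module (illustrative subset; extend to all 13 modules).
required = {
    "Module 7 (Climate)": ["Scope 1", "Scope 2 location", "Scope 2 market",
                           "Scope 3 categories", "verification statement"],
    "Module 13 (Sign-off)": ["signatory", "date"],
}

# Items the team has confirmed as complete so far.
completed = {
    "Module 7 (Climate)": {"Scope 1", "Scope 2 location", "Scope 2 market",
                           "Scope 3 categories"},
    "Module 13 (Sign-off)": {"signatory", "date"},
}

def gaps(required, completed):
    """Return the outstanding items per module, omitting complete modules."""
    return {
        module: [item for item in items if item not in completed.get(module, set())]
        for module, items in required.items()
        if any(item not in completed.get(module, set()) for item in items)
    }

print(gaps(required, completed))
# Here, Module 7 still lacks its verification statement.
```

Running this on each checklist pass gives a live gap list to schedule remaining work against.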
Data and methodology consistency
- Boundary consistency: the same consolidation approach used across all questions where boundary applies
- Currency consistency: all financial figures in the same currency (or with explicit conversion rate cited)
- Time period consistency: all data refers to the same reporting year (with prior years explicitly labelled)
- Methodology consistency: the same emission factors, GWPs, and calculation approaches applied across the response
- Numerical reconciliation: Module 7 emissions number matches the verification statement; targets in Module 5 use the same baseline as Module 7
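The numerical reconciliation checks above are mechanical, so they can be scripted against figures pulled from your ORS export and the verification statement. This is a hedged sketch: the field names (`module7_scope1_tco2e` and so on) are illustrative placeholders, not CDP's actual schema.

```python
# Figures transcribed from the draft response and the verification statement.
# All field names are hypothetical; map them to your own data extract.
response = {
    "module7_scope1_tco2e": 124_300,
    "module5_target_baseline_tco2e": 142_500,
    "module7_baseline_tco2e": 142_500,
}
verification = {"scope1_tco2e": 124_300}

def reconcile(response, verification):
    """Return (check name, passed) tuples for each numerical consistency check."""
    return [
        ("Scope 1 matches verification statement",
         response["module7_scope1_tco2e"] == verification["scope1_tco2e"]),
        ("Module 5 target baseline matches Module 7 baseline",
         response["module5_target_baseline_tco2e"]
         == response["module7_baseline_tco2e"]),
    ]

for name, passed in reconcile(response, verification):
    print(f"{'PASS' if passed else 'FAIL'}: {name}")
```

A FAIL line flags a figure to trace back to source before submission.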
Evidence and attachments
- Verification statement attached: PDF uploaded; statement signed by the verifier; coverage clearly stated
- Other key evidence attached: sustainability report, transition plan document, board sustainability committee charter, materiality assessment matrix
- Public references current: all referenced URLs (your sustainability website, policies, certifications) are live and current
- No broken links: sample-test 5-10 URLs across the response
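The URL spot-check can be automated with the standard library. A network sketch, assuming outbound connectivity from wherever you run it: results depend on the network at test time, so treat failures as items to retest manually rather than definitive breakage.

```python
import urllib.error
import urllib.request

def check_urls(urls, timeout=10):
    """Return {url: True/False} based on whether a HEAD request succeeds."""
    results = {}
    for url in urls:
        req = urllib.request.Request(
            url, method="HEAD", headers={"User-Agent": "link-check/1.0"})
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                # Treat 2xx/3xx as live; 4xx/5xx raise or are flagged below.
                results[url] = resp.status < 400
        except (urllib.error.URLError, TimeoutError, ValueError):
            results[url] = False
    return results

# Sample-test the URLs cited in the response, e.g.:
# check_urls(["https://example.com/sustainability", ...])
```

Some sites reject HEAD requests; retry any failures with a GET (or a browser) before editing the response.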
Cross-module coherence
- Sector classification consistent: the sector you selected in Module 1 is reflected in the questions you answered in subsequent modules
- Board governance consistent: Module 4 governance disclosures align with Module 5 strategy disclosures
- Engagement coverage consistent: suppliers engaged for emissions data in Module 7 match suppliers engaged for biodiversity in Module 11
- Pathway consistent: if you are on the Supply Chain pathway, the engagement disclosure reflects this; if on RE100, the energy disclosure reflects RE100 status
Disclosure preferences
- Public/private toggle confirmed: intentional decision documented; if private, decision approved at the right level (often CFO or CSO)
- Confidentiality flags: any specific data points marked confidential where appropriate (typically future capex plans, M&A diligence, sensitive supplier identities)
Sign-off and submission
- Sign-off collected: the right person has approved (typically CEO and CFO, with board representative for governance and strategy)
- Final ORS check: all modules show "complete" status; submit button is enabled; you are within the deadline window
How to run the checklist
The recommended cadence:
- 2 weeks before deadline: First pass through the checklist by the sustainability lead. Identify gaps. Schedule remaining work.
- 1 week before: Second pass. Verify all items. Address any remaining issues.
- Deadline week, day 1-2: Third pass. Sign-off collection. Final review by CEO and CFO.
- Deadline week, day 3-5: Submit. Last item on the checklist (the submit confirmation) closes the project.
A specific worked check
Worked example
FoodCo Ltd's checklist run, two weeks before deadline:
- Item 7 (Climate): Verification statement uploaded? Yes (KPMG India, dated June 2026). Coverage confirmed? Yes, includes 100 percent of Scope 1+2 plus Scope 3 Categories 1, 4, 6, and 11. Methodology disclosed in Q7.9? Yes, with factor sources cited.
- Item 8 (Forests): Palm oil disclosure? Yes, with mill-level traceability for 78 percent of volume. Cocoa disclosure? Yes, with Year 3 progression noted. Cut-off date stated? Yes, 31 December 2020. Target year stated? Yes, 2027.
- Item 14 (Boundary consistency): Consolidation approach stated as "operational control" in Q1.4. Module 7 disclosures align with operational control. Module 11 (biodiversity) supplier mapping uses the same boundary. PASS.
- Item 18 (Numerical reconciliation): Q7.6 reports Scope 1 = 124,300 tCO2e for FY25. The verification statement states 124,300 tCO2e for the same period. Targets in Module 5 use 2020 as the baseline; baseline emissions in Q7.1.1 = 142,500 tCO2e. PASS.
- Item 22 (No broken links): Spot-checked 8 URLs cited in the response. All live. PASS.
- Item 27 (Public/private): Set to public. CFO confirmed in writing.
- Item 29 (Sign-off): CEO and CFO both signed. Board sustainability committee briefed at its May 2026 meeting; minutes confirm.
The team finds three issues during the checklist run:
- The emission factor source for natural gas was not clearly cited in Q7.9 (added).
- The Module 11 IBAT mapping date was inconsistent with Module 1 (corrected).
- One supplier list referenced in Module 7 had an outdated count (corrected).
These three corrections, caught two weeks before submission, are worth roughly 2-3 points across modules. That is the value of running the checklist.
What happens after submission
The CDP scoring partner reviews your response over the next 3-4 months. In that period:
- You may receive clarification queries from the scorer through the ORS. Respond promptly with documentation. These are an opportunity to add detail, not a sign of trouble.
- You should archive everything locally: a final PDF of the submitted response, all data spreadsheets, all source documents, all attachments. This is the foundation of next year's response and the basis for any post-publication conversation.
- You should notify key stakeholders that submission has happened, with a brief statement on what is coming. Internal: executive committee, board, regional sustainability leads. External: priority investors, key B2B customers, SDG-aligned partners.
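The local archive described above is worth making tamper-evident, since it underpins next year's response and any post-publication conversation. A minimal sketch: bundle the submitted PDF, data spreadsheets, and attachments into a dated zip with a SHA-256 manifest. File names and the output directory are illustrative.

```python
import hashlib
import json
import zipfile
from datetime import date
from pathlib import Path

def archive_submission(paths, out_dir="archive"):
    """Zip the given files with a JSON manifest of SHA-256 checksums.

    Returns the zip path and the manifest dict. Checksums let you later
    prove a source file is unchanged since submission.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    zip_path = out / f"cdp_submission_{date.today().isoformat()}.zip"
    manifest = {}
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in map(Path, paths):
            manifest[p.name] = hashlib.sha256(p.read_bytes()).hexdigest()
            zf.write(p, arcname=p.name)
        zf.writestr("MANIFEST.json", json.dumps(manifest, indent=2))
    return zip_path, manifest

# Example usage (hypothetical file names):
# archive_submission(["final_response.pdf", "emissions_data.xlsx"])
```

Store the zip somewhere with retention controls, not only on one laptop.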
When the score lands
The score arrives in December or January. With it comes the scorer feedback report, the most valuable artefact CDP produces. It tells you exactly which questions earned full points, which earned partial, which earned none, and what would have been needed for the next tier on each.
Read it carefully. Quote it in your January internal review. Use it to plan the next year's project.
Many companies expect the score reveal to be a major moment. In practice, the most-discussed part of the score release is often not the letter itself but the scorer feedback report. The letter is a summary; the report is the diagnostic. A team that focuses only on the letter and skips the feedback report leaves money on the table for the next cycle. A team that uses the feedback report to design the next year's project can unlock one to two letter-grade improvements over multi-year cycles.
Key Takeaways
- The pre-submission checklist is 30 specific items across module completeness, data consistency, evidence attachments, cross-module coherence, disclosure preferences, and sign-off
- Run the checklist three times: two weeks out, one week out, and deadline week
- Common issues the checklist catches: missing emission factor sources, inconsistent boundary references, broken URLs, outdated counts
- The scorer feedback report (released with the score) is the most valuable diagnostic CDP produces; treat it as the design input for the next cycle
- Companies that treat the post-submission and pre-feedback period as quiet time underperform; A-list responders use it for archival, stakeholder briefing, and next-cycle planning
Knowledge Check
After you click submit, can you re-edit your response?
Which document is the most valuable artefact CDP produces for you?
True or false: Saying 'verified' without uploading the verification statement earns full credit.
Which items should the pre-submission checklist cover?
Select all that apply
When should you run the pre-submission checklist?
Match each post-submission activity to its timing.
- Archive everything locally
- Notify key stakeholders
- Receive scorer feedback report
- Plan next cycle from feedback
