Mastering CDP Scoring
ESG/Module 2: Foundations and the scoring system/Lesson 4 of 4/6 min read

The ORS, deadlines, and the response lifecycle


Key takeaway

Once you know which pathway applies to you, the next thing to understand is the system you are actually using. The Online Response System (ORS) is CDP's portal where every response is drafted, reviewed, and submitted. The ORS is also where the most preventable mistakes happen: missed deadlines, lost work, conditional questions never answered, and submissions sent in before legal review. This lesson walks the lifecycle from invitation to score release.

The annual cycle at a glance

The CDP year follows a predictable rhythm. Lock these dates into your project plan from day one.

CDP annual disclosure cycle from questionnaire release to score publication

| When | What happens | What you do |
|---|---|---|
| January to March | CDP releases the year's questionnaire and scoring methodology | Read the methodology, identify changes from last year, scope the project |
| April | ORS opens for responses; invitations sent | Confirm pathway, accept invitations, start data collection |
| May to mid-June | Drafting period | Write answers, run internal reviews, gather verification |
| Late July (typical) | Submission deadline | Final sign-off and submit |
| August to November | CDP scoring partners grade responses | Wait, occasionally respond to clarification requests |
| December or January | Letter grades published | Receive your score, plan for next year |

The submission deadline is firm. Late submissions are not scored.

The ORS interface in three sentences

The ORS is a web application at myportal.cdp.net. You log in, see your assigned questionnaire, and fill it in module by module. Each question sits in one of three states: blank, draft (unsaved), or saved; only saved answers count at submission, so save constantly.

Once you click submit, the response is locked. You cannot edit afterwards. There is no second-chance window.

The hidden complexity: conditional logic

The ORS does not show you all questions at once. Many questions are gated behind your earlier answers.

Analogy

Think of the ORS like a TurboTax-style interview. If you say you have no children, the form does not ask about childcare expenses. If you say you have a board, you get a series of board oversight questions. If you say you do not have a board, those questions disappear entirely. You are answering a customised questionnaire generated by your earlier answers.

This conditional logic is responsible for two common pitfalls:

  • Phantom questions you never see. You answer "no" to a gating question, and the system hides 15 follow-up questions. If those hidden follow-ups carried more available points than the "no" path, your score ceiling is lower than it needed to be, and you will never see why.
  • Late-arriving questions. You answer the strategy module first, then go back to the introduction and change an industry classification. The system regenerates your questionnaire, and now you have eight new questions you had never seen before.

The practitioner heuristic: answer the introduction module first and lock it down. Module 1 (Introduction) decides what the rest of the questionnaire looks like for you. Once it is locked, the rest of the questionnaire does not shift under you.
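The gating behaviour above can be sketched in a few lines of Python. This is an illustration only: the real ORS logic is internal to CDP's portal, and the question IDs and gating rules below are hypothetical.

```python
# Hypothetical gating rules: (gating question, required answer) -> unlocked follow-ups.
GATING_RULES = {
    ("1.1_has_board", "yes"): ["4.1_board_oversight", "4.2_board_competence"],
    ("1.2_industry", "pharma"): ["9.1_sector_emissions", "9.2_sector_targets"],
}

# Questions everyone sees regardless of earlier answers.
BASE_QUESTIONS = ["1.1_has_board", "1.2_industry", "2.1_strategy"]

def visible_questions(answers: dict) -> list:
    """Return the customised questionnaire generated by the answers so far."""
    questions = list(BASE_QUESTIONS)
    for (gate, required), follow_ups in GATING_RULES.items():
        if answers.get(gate) == required:
            questions.extend(follow_ups)
    return questions

# Changing one Introduction answer regenerates the questionnaire:
before = visible_questions({"1.1_has_board": "yes", "1.2_industry": "retail"})
after = visible_questions({"1.1_has_board": "yes", "1.2_industry": "pharma"})
new_questions = set(after) - set(before)
print(sorted(new_questions))  # the "late-arriving" questions
```

This is why locking Module 1 first works: once the gating answers stop changing, the set returned by the equivalent of `visible_questions` stops changing too.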

What "saved" really means

Most teams learn a hard lesson here. The ORS is browser-based, and saving means clicking save: close a tab without saving and the work is gone. Some text fields auto-save, but most do not.

The boring practice that prevents disasters:

  • Draft outside the ORS. Write all answers in a Google Doc or Word file first. Paste into the ORS once you are happy.
  • Save after every cell. Every dropdown, every text field. Click save.
  • Export your draft weekly. ORS lets you export a PDF of your current response. Do this every Friday throughout the project so you have a recoverable copy.

Worked example

A common scenario. A sustainability lead at a Bengaluru-based pharma company spent four hours on a Saturday writing detailed Scope 3 methodology answers directly in the ORS. The browser crashed before the final save. Four hours of work gone. The fix would have been writing in Docs first, pasting last.

This is not a hypothetical. It happens every year, in every cohort, to several companies. The ORS is a stable system, but it is still a browser session.

The roles you need on the project

A clean ORS submission has at least four named roles:

  • Disclosure lead. Owns the response end-to-end, makes the call on edge cases, controls the submit button.
  • Subject matter contributors. Operations, finance, HR, procurement, legal. Each contributes to specific questions but does not own the response.
  • Reviewer. Usually CSO or sustainability head. Reads the full draft before submission, flags inconsistencies.
  • Sign-off authority. CFO, Chief Risk Officer, or board representative. Approves final submission. CDP expects board-level sign-off on key questions, especially in Governance and Strategy modules.

Larger companies add an external assurance provider (EY, KPMG, BDO, Bureau Veritas, etc.) for ISO 14064-3 verification of emissions data. The verification statement is uploaded as an attachment in the climate performance module.

The submission checklist

Before you click the submit button, the operational checklist looks like this:

  • All modules show "completed" status in the ORS sidebar
  • Every required question has a saved answer (not just a draft)
  • All attachments uploaded (verification statements, board minutes, policies)
  • Public-or-private toggle confirmed
  • Sign-off signature collected (board, CFO, CSO depending on company governance)
  • PDF of the full response exported and stored locally
  • Internal communications drafted (investor letter, employee announcement, supply chain customer notifications)
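A simple way to make this checklist enforceable is a pre-submit gate. The sketch below is illustrative Python, not part of the ORS; the item names mirror the bullets above and are not CDP's actual field names.

```python
# Illustrative pre-submit gate; item names are hypothetical stand-ins
# for the checklist bullets above.
CHECKLIST = [
    "all_modules_completed",
    "all_answers_saved",
    "attachments_uploaded",
    "public_private_toggle_confirmed",
    "sign_off_collected",
    "pdf_exported",
    "comms_drafted",
]

def ready_to_submit(status: dict) -> tuple:
    """Return (ok, missing_items); submission is blocked if anything is missing."""
    missing = [item for item in CHECKLIST if not status.get(item, False)]
    return (len(missing) == 0, missing)

status = {item: True for item in CHECKLIST}
status["pdf_exported"] = False  # one forgotten step blocks the whole submission
ok, missing = ready_to_submit(status)
print(ok, missing)
```

Because submission is irreversible, the useful property here is that a single unchecked item returns `ok = False`; there is no partial credit at the submit button.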

After submission, your response goes to one of CDP's accredited scoring partners (firms like ERM, South Pole, Keramida, ADEC, RyeStrategy). A trained scorer reads your full response, applies the published rubric question by question, and assigns points. This is not automated. Scoring is human, partner-graded, and supervised by CDP's central team. The volume (22,000-plus disclosing companies, around 1,000 questions per response in the most expanded form) is what takes three months. If your score seems off when it lands, you can request a review through CDP's scorer query process.

After the score lands

When your letter is published in December or January, the work is not over. Three things happen in the weeks after:

  • The score gets noticed. Procurement teams at your B2B customers see it. Investors update their ESG screens. ESG raters refresh their corporate ratings.
  • You get a scorer feedback report. This is the most valuable artefact CDP produces for you. It tells you exactly where you lost points and what would have unlocked them. Read it line by line. It is the blueprint for next year's improvement.
  • The improvement project starts immediately. Your next-year submission window opens in four months. The teams that go from C to B do so because they read the feedback in January and start fixing gaps in February, not in May.

Worked example

HUL (Hindustan Unilever), India. Has consistently scored A or A minus on CDP Climate for several years. Their public approach: they treat the scorer feedback report as a strategic document, share it with the executive committee, and budget the next year's environmental data investments against the gaps it identifies. The CDP cycle is integrated into their annual operating rhythm, not bolted on.

This is the practitioner takeaway: A-list scorers do not start CDP work in April. They start in January, the day the feedback lands.

Key Takeaways

  1. The CDP year runs from the January methodology release through the April ORS opening, the late-July submission deadline, and the December or January score release
  2. The ORS uses conditional logic, so answer the Introduction module first to lock the shape of the rest of your questionnaire
  3. Always draft answers in a separate document first; the ORS is browser-based and work can be lost
  4. A clean submission needs at least four roles: disclosure lead, subject matter contributors, reviewer, and sign-off authority
  5. The scorer feedback report you receive in January is the most valuable improvement document of the year, so plan the next cycle from it

Knowledge Check

Test what you just learned

6 questions · check each one as you go


What does ORS stand for?

When does the typical CDP cycle's submission deadline fall?

What is the smartest way to handle the conditional logic in the ORS?

True or false: Once you click submit, you can re-edit and re-submit your response.

Which roles should a strong CDP response team include?

Select all that apply

Match each phase of the CDP year to what happens.

Match each item to its pair

January-March

April

May-mid-June

Late July

December-January

We simplify.
We show you the source.
We make the work easy for you.

This is the whole deal.

— GREENTRYST