Sustainability / ESG Reporting in Practice
Data Collection & The Data Requirement Sheet (Lesson 4 of 5, 9 min read)

Data Sanity Checks: The Practitioner's Checklist


Data has started arriving. The sustainability team has forwarded spreadsheets from HR, operations has sent energy data, finance has shared economic performance numbers. Your instinct might be to plug it straight into the report and start writing. Do not do that.

Every piece of data that enters your report needs to pass through a sanity check. Not a formal audit: that is the assurance provider's job. But a systematic, common-sense review that catches the errors you will absolutely encounter if you do not look for them. The errors in ESG data are rarely subtle. They are decimal points in the wrong place, totals that do not add up, numbers that are physically impossible, and (most dangerously) zeros that are not actually zero.

This lesson gives you the practitioner's checklist. Use it every time new data arrives, not just at the end.

Check 1: Extremes - Zero and One Hundred Percent

The two most dangerous numbers in ESG reporting are 0 and 100%. Both require proof.

If a company reports zero workplace injuries, zero corruption incidents, zero environmental violations, or zero waste to landfill, that is a bold claim. It might be true. But it needs to be backed by evidence, not just the absence of data.

"The absence of evidence is not the evidence of absence."

This is the single most important principle in ESG data verification. If data is missing, it does NOT mean the value is zero. A company that does not track workplace injuries is not a company with zero injuries: it is a company that does not know how many injuries it has. These are fundamentally different things, and confusing them can create serious credibility and legal risks.

When you see a zero in the data, ask: "Is this a confirmed zero backed by a tracking system, or is this a 'we do not have the data so we put zero'?" The answer will determine whether you report it as zero, report it as "not tracked," or flag it as a data gap.

The same logic applies to 100%. If a company claims 100% renewable energy, 100% employee satisfaction, 100% supplier compliance, or 100% of waste recycled, you need to see the evidence. Absolute claims attract scrutiny from assurance providers, rating agencies, and informed readers. If it is genuinely 100%, report it proudly. If it is 98% and someone rounded up, report 98%.
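The zero-versus-missing distinction described above can be made mechanical. Here is a minimal Python sketch, where the metric names and the "tracking confirmed" flag are invented for illustration, not part of any reporting standard:

```python
# Hypothetical sketch: classify incoming values before treating them as zeros.
# A missing value (None) and a reported 0 must be handled differently.

def classify_metric(value, tracking_confirmed):
    """Return how a reported value should appear in the report."""
    if value is None:
        return "not tracked"          # data gap: never report as 0
    if value == 0 and not tracking_confirmed:
        return "flag for follow-up"   # suspicious zero: ask for evidence
    if value == 0:
        return "confirmed zero"       # backed by a tracking system
    return "reported value"

submissions = {
    "workplace_injuries": (0, True),       # HR runs an incident register
    "corruption_incidents": (0, False),    # no tracking system cited
    "waste_to_landfill_t": (None, False),  # column was simply left blank
}

for metric, (value, confirmed) in submissions.items():
    print(metric, "->", classify_metric(value, confirmed))
```

The point of the three-way split is exactly the question in the text: a zero only survives into the report if someone can point to the system that produced it.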

Check 2: Internal Consistency

The same metric must not show different numbers in different places within the report. This sounds obvious, but it happens constantly.

Total energy consumption appears in the environmental section as 45,000 GJ. The data table in the appendix says 44,800 GJ. The CEO's message references "over 50,000 GJ of energy managed efficiently." Three different numbers for the same metric in the same document. This destroys credibility.

How it happens: different sections are written by different people at different times, using different versions of the data. The environmental chapter was written with the first data submission. The appendix was updated with the revised data. The CEO's message was written from memory. Nobody cross-checked.

The fix: Maintain a master data table - one single source of truth. Every section of the report pulls from this table. When data gets updated, the table gets updated, and you search the entire report for every instance of that metric and update them all.
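A master-table cross-check can even be partly automated. The sketch below scans draft text for quoted figures and compares them against the single source of truth; the metric name, draft snippets, and the "N GJ" pattern are invented for the example:

```python
# Illustrative sketch: every figure quoted in the narrative should match the
# master data table. Draft text and figures here are invented.
import re

master = {"total_energy_gj": 44_800}  # the one agreed value

draft_sections = {
    "environmental chapter": "Total energy consumption was 45,000 GJ.",
    "appendix table": "Energy: 44,800 GJ",
    "CEO message": "over 50,000 GJ of energy managed efficiently",
}

def quoted_energy_figures(text):
    """Pull every 'N GJ' figure out of a block of text."""
    return [int(m.replace(",", "")) for m in re.findall(r"([\d,]+)\s*GJ", text)]

for section, text in draft_sections.items():
    for figure in quoted_energy_figures(text):
        if figure != master["total_energy_gj"]:
            print(f"MISMATCH in {section}: {figure:,} GJ "
                  f"vs master {master['total_energy_gj']:,} GJ")
```

Run against the three drafts above, this flags the environmental chapter and the CEO message but passes the appendix, which is exactly the failure pattern described in the 45,000 / 44,800 / 50,000 example.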

Check 3: Cross-Report Consistency

Your sustainability report does not exist in isolation. The same company likely publishes an annual report, may file a BRSR (in India), responds to CDP questionnaires, and submits data to rating agencies. If the reporting boundary is the same across these documents, the data must match.

This is a common and serious error. The sustainability report says total Scope 1 emissions are 12,500 tCO2e. The BRSR filed with the annual report says 11,800 tCO2e. A rating agency pulls both and immediately flags the inconsistency.

Example: The BRSR mismatch

A company reports water consumption of 2.3 million kilolitres in their sustainability report. Their BRSR, filed six months earlier, reports 1.9 million kilolitres. An investor notices the discrepancy and raises it during a shareholder meeting.

The reason: the sustainability report included data from a newly acquired subsidiary, but the BRSR did not because it was filed before the acquisition closed. Both numbers were technically correct for their respective boundaries, but nobody explained the difference, and neither document noted the boundary change.

The fix: always compare your data against any previously published reports. If boundaries differ, note it explicitly. If they are the same, the numbers must match exactly.

Check 4: Year-on-Year Changes

Significant swings in data from one year to the next need an explanation. Not necessarily an explanation printed in the report, but you need to understand why the number moved before you put it in.

If Scope 2 emissions dropped 40% year-on-year, is that because the company genuinely shifted to renewable energy, or because someone changed the emission factor used for grid electricity? If employee turnover doubled, is that a real HR crisis, or did the company expand its reporting boundary to include contract workers?

Look for:

  • Changes greater than 20-25% in either direction: these always warrant investigation
  • Methodology changes: did the calculation approach change? Different emission factors? Different consolidation method?
  • Boundary changes: were new sites or subsidiaries added or removed?
  • One-off events: major acquisitions, divestitures, plant shutdowns, COVID impacts (still showing up in multi-year data)

If the change is real and explainable, include the explanation in the report. Readers and rating agencies will ask: better to answer proactively than reactively.

Check 5: Decimal Places

This is a small thing that makes a big difference to the credibility of your report. Reporting GHG emissions as 12,456.7893 tonnes CO2e suggests a level of precision that does not exist in reality. Most environmental data involves estimates, assumptions, and emission factors with their own uncertainty ranges. Reporting to four decimal places is false precision.

General rule: Two decimal places maximum for most metrics. For large numbers (revenue, total emissions, energy consumption), zero or one decimal place is usually appropriate. Match the precision to the certainty of the data.

Be consistent too - do not report one metric to four decimal places and another to zero. Pick a convention and stick to it across all data tables.
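One way to enforce a single convention is to route every figure through one formatting helper. The category names below are invented; the precision rules follow the text (zero to one decimal for large totals, two maximum for ratios):

```python
# Centralise the rounding convention so no table invents its own precision.
PRECISION = {"large_total": 0, "ratio": 2, "default": 1}

def present(value, kind="default"):
    """Round a figure to the agreed convention and format with separators."""
    dp = PRECISION.get(kind, PRECISION["default"])
    return f"{round(value, dp):,.{dp}f}"

print(present(12456.7893, "large_total"))  # -> 12,457
print(present(0.98342, "ratio"))           # -> 0.98
```

The 12,456.7893 tonnes from the paragraph above becomes 12,457: honest about the underlying uncertainty and consistent across tables.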

Check 6: Currency Consistency

If the company operates across geographies or if you are reporting multi-year economic data, currency matters. Are all financial figures in the same currency? Is it the reporting currency of the company? If currency conversions were applied, what exchange rate was used (average for the period, or year-end spot rate)?

This is especially relevant for:

  • Economic performance data (GRI 201)
  • Community investment and CSR spend
  • Procurement data
  • Executive compensation
  • Environmental expenditure

Note the currency and any conversion methodology clearly in the report.

Check 7: GHG Inventory Breakups

The greenhouse gas inventory is one of the most scrutinized sections of any ESG report. The definitions and category breakups must be crystal clear and internally consistent.

Check that:

  • Scope 1, 2, and 3 boundaries are clearly defined and match the company's stated reporting boundary
  • Emission sources within each scope are listed: not just a total number, but what is included
  • The methodology is stated: which emission factors were used, which GHG Protocol approach (location-based vs. market-based for Scope 2)
  • Gases included are noted: CO2 only, or CO2, CH4, N2O, and others?
  • Breakups add up to totals: if you provide Scope 1 by source (stationary combustion, mobile combustion, process emissions, fugitive emissions), those subcategories must sum to the total Scope 1 figure

Think of the GHG inventory like a financial balance sheet. If the line items do not add up to the total, no auditor would sign off on it. The same standard should apply to your emissions data. Every subtotal must reconcile with the grand total, and every category must be defined clearly enough that someone could replicate your calculation.
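The reconciliation itself is a one-function job. A sketch with invented figures, using a small tolerance so that legitimate rounding does not trigger false alarms:

```python
# Every scope's sub-sources must sum to the reported scope total
# (within a rounding tolerance). The figures are invented.
import math

inventory = {
    "scope1": {
        "total": 12_500.0,
        "sources": {
            "stationary_combustion": 7_200.0,
            "mobile_combustion": 3_100.0,
            "process_emissions": 1_400.0,
            "fugitive_emissions": 800.0,
        },
    },
}

def reconcile(inventory, tol=0.5):
    """Return scopes whose source breakdown does not match the stated total."""
    problems = {}
    for scope, data in inventory.items():
        subtotal = sum(data["sources"].values())
        if not math.isclose(subtotal, data["total"], abs_tol=tol):
            problems[scope] = (subtotal, data["total"])
    return problems

print(reconcile(inventory))  # -> {} because 7,200 + 3,100 + 1,400 + 800 = 12,500
```

If the result is non-empty, either a source is missing from the breakdown or the total is stale: both are exactly the failures an assurance provider would catch later, at much greater cost.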

Check 8: Numbers Must Add Up

This is the most basic check and yet one of the most frequently failed. Totals must equal the sum of their parts. Always.

If you report 5,000 male employees and 3,000 female employees, total headcount must be 8,000 (or include a third category that accounts for the difference). If you report energy consumption by source (20,000 GJ electricity, 15,000 GJ natural gas, 5,000 GJ diesel), total energy must be 40,000 GJ.

This applies to every data table in the report. Every single one. Pull out a calculator and add the columns. It takes five minutes per table and catches errors that would be embarrassing once published.

A simple checklist for every data table before it goes into the report:

  1. Do the rows add up to the totals shown?
  2. Do the percentages add up to 100% (or close to it, with rounding noted)?
  3. Are the units consistent within the table?
  4. Does this data match what is reported elsewhere in the document?
  5. Is the year-on-year change plausible?

If any answer is no, stop and investigate before writing the narrative around the data.
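The first two items on that checklist can be scripted as a pre-flight function for each table. The table structure below (named rows, a stated total, optional percentages) is a simplifying assumption; real tables will need adapting:

```python
# Pre-flight check for one data table: rows must sum to the stated total,
# and percentages must sum to ~100% within a rounding tolerance.

def check_table(rows, stated_total, percentages=None, pct_tol=0.5):
    """Return a list of failures; an empty list means the table passes."""
    failures = []
    if sum(rows.values()) != stated_total:
        failures.append(
            f"rows sum to {sum(rows.values())}, table says {stated_total}"
        )
    if percentages is not None:
        pct_sum = sum(percentages)
        if abs(pct_sum - 100.0) > pct_tol:
            failures.append(f"percentages sum to {pct_sum}, not ~100%")
    return failures

headcount = {"male": 5_000, "female": 3_000}
print(check_table(headcount, 8_500))                # catches the 500 mismatch
print(check_table(headcount, 8_000, [62.5, 37.5]))  # passes -> []
```

This is the calculator step from the paragraph above, just made repeatable: run it on every table, every time the data is revised.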

Putting It All Together: A Systematic Approach

Do not try to do all of these checks at the end when the report is nearly finished. By then, corrections are painful and time-consuming. Instead, run sanity checks every time new data arrives:

  1. Immediate check (when data is received): Units correct? Right time period? Obvious gaps or zeros?
  2. Detailed check (before using in report): Internal consistency, year-on-year changes, totals adding up, decimal places
  3. Cross-check (during final QA): Cross-report consistency, GHG inventory reconciliation, currency consistency

As a consultant, you generally cannot verify data against the original source unless the client shares it with you. You are not an auditor. Your sanity checks are about catching obvious errors, inconsistencies, and implausible numbers - not about certifying accuracy.

If the company opts for third-party assurance, that significantly increases confidence in data quality. Here is an important practical tip: try to get the final data only AFTER assurance is complete. If you write the report using pre-assurance data, you will almost certainly need to revise numbers when assurance findings come in. This creates extra work and version control headaches.

If assurance is not being done, be transparent about the limitations. Note in the report that data has not been independently verified. This protects both you and the client.

Data sanity is not about perfection. No ESG report has perfectly precise data: the nature of sustainability metrics involves estimates, assumptions, and judgment calls. But there is a vast difference between data that is imprecise (acceptable and transparent) and data that is wrong (unacceptable and damaging). Your sanity checks are the line between the two.

Key Takeaways

  1. Always question zeros and 100% claims - absence of data is not evidence of absence. Confirm whether a zero is measured or simply untracked.
  2. Maintain a single master data table as the source of truth and cross-check every instance of a metric across all sections of the report.
  3. Investigate year-on-year changes greater than 20-25% before including them - they may reflect methodology or boundary changes, not real performance shifts.
  4. Ensure GHG inventory subcategories sum to reported totals, and state the methodology, emission factors, and gases included.
  5. Run sanity checks when data arrives, not just at the end - check units, totals, decimal precision, and cross-report consistency at every stage.

Knowledge Check

1. When checking GHG emissions data before including it in the report, which of the following is NOT part of the GHG inventory sanity check?

2. A report shows 5,000 male employees and 3,000 female employees, but lists total headcount as 8,500. What is the correct response?

3. What is the recommended approach for timing sanity checks during an ESG reporting engagement?