Lessons from the Field: What I Wish I Knew on Day One
This is the final lesson of the course. No new frameworks to learn, no new processes to memorize. Instead, this is the collection of things that only become clear after you have been through a few engagements: the hard-won insights that nobody tells you at the start, because they only make sense once you have lived them.
If you are about to start your first ESG reporting engagement, read this carefully. If you have already been through one, you will probably nod along and wish someone had told you all of this sooner.
"Just Go Ahead and Write It"
This is the single most important piece of advice in this entire course.
When you start an engagement, you will expect the client to provide you with narratives, context, and detailed information about their sustainability initiatives. You will send them questionnaires. You will schedule calls. You will ask them to describe their programs, their strategies, their achievements. And then you will wait.
You will keep waiting.
They will not give it to you. Not because they are uncooperative, but because they are busy, they do not think in "report narratives," and they do not know how to articulate what you need in the format you need it. The sustainability team has a hundred other things on their plate. Writing paragraphs for your report is not their priority.
Here is what you do instead: you write it yourself.
Go through their past sustainability reports. Read their annual reports (the Chairman's message, the MD&A section, the CSR section). Read their policies: environmental policy, human rights policy, health and safety policy. Read their website. Read their press releases. Read their investor presentations. Read everything that is publicly available.
Then sit down and write the sections yourself. Write the company overview. Write the environmental narrative. Write the governance section. Do not wait for perfect information: work with what you have and fill the gaps with reasonable placeholders.
Share your draft with the client. They will read it and say, "This is mostly right, but this part is wrong, and this part needs to be updated, and we actually stopped doing that program last year." Perfect. Now you have their corrections, their updates, and their attention. They will engage with a document that exists far more readily than they will respond to an open-ended request for input.
The workflow is not: client provides information, you write the report. The workflow is: you write the report from available sources, client corrects and refines it. That is how it actually works. Every experienced consultant learns this: the only question is how many frustrating weeks of waiting it takes before you figure it out.
This approach has a second benefit that is easy to miss. When you write the first draft yourself, you control the quality, the tone, and the flow from the very beginning. You are not stitching together paragraphs written by six different departments in six different styles. You are writing one coherent document and getting corrections on it. The final product is almost always better than what you would get by assembling client-provided content.
The Reports That Stand Out
After you have read dozens of ESG reports (your clients' reports, their peers' reports, award-winning reports, and forgettable ones), you start to notice what separates the good from the mediocre.
It is not the framework. Everyone references GRI. It is not the data. Most companies in the same industry report similar metrics. It is not even the design, though that helps.
The reports that stand out get three things right:
1. The balance between visuals, narrative, and data. No single element dominates. A page might have a compelling chart, a short paragraph of context, and a key metric called out in a highlight box: all working together. The reader absorbs information through multiple channels simultaneously. Compare this to reports that are 100 pages of dense text, or reports that are beautiful infographics with no substance behind them. Neither works.
2. They are concise. The best reports say what needs to be said and stop. They do not pad sections to hit a page count. They do not repeat the same policy description in every chapter. They do not include three paragraphs of generic industry context that add nothing specific to the company. If the report can be 80 pages instead of 150, the better report is the 80-page one.
3. They have a clear editorial voice. The report sounds like it was written by someone who understood the company and cared about communicating clearly. It does not sound like it was generated by a machine or assembled from boilerplate paragraphs.
Think about the last presentation you sat through that actually held your attention. It probably was not the longest one. It probably had clean slides with clear visuals, a speaker who knew the material, and a structure that moved forward without circling back. The same principles apply to sustainability reports. Respect your reader's time and attention, and they will actually read what you wrote.
The AI Slop Problem
It is 2026, and AI-generated content is everywhere in ESG reports. You can spot it. Your clients can spot it. Rating analysts can spot it. And it is becoming a real credibility issue.
Using AI to write first drafts is fine. It is a legitimate tool that can save time on boilerplate sections, data descriptions, and structural outlines. The problem is when the AI-generated text goes into the report without meaningful human editing. The result is what the industry has started calling "AI slop": text that is grammatically correct, superficially coherent, and completely devoid of specificity or genuine insight.
AI slop sounds like this: "The company remains committed to advancing its sustainability journey through a comprehensive approach that integrates environmental stewardship, social responsibility, and robust governance practices across its operations." That sentence says absolutely nothing. It could apply to any company on earth. It is filler masquerading as content.
Major consulting firms have already been penalized for submitting AI-generated deliverables without adequate review. The reputational risk is real.
The rule is simple: AI writes the first draft, you edit it to be human. Make it specific to this company. Remove the empty phrases. Add the details that only someone who understands the business would include. Check every factual claim. If a sentence could appear in any company's report without modification, it does not belong in this one.
The differentiator in 2026 is not who uses AI (everyone does). The differentiator is who edits it well enough that you cannot tell.
People Management IS the Job
Here is something that will surprise you if you are coming from a technical background: the technical work in ESG reporting is the easy part. Understanding GRI, building data templates, writing chapters, running sanity checks: these are skills you can learn from a course (including this one). They are important, but they are not what makes or breaks an engagement.
What makes or breaks an engagement is people management.
Getting data out of departments that do not want to share it. Managing a client who changes their mind about the report's direction after you have written half of it. Handling conflicting feedback from three different stakeholders who all have sign-off authority. Pushing back on unrealistic timelines without damaging the relationship. Navigating the politics of who gets quoted, whose achievements get highlighted, and whose department looks good in the report.
This is the job. The data template is just the tool.
You will work with people who do not understand what you are doing but have strong opinions about how it should be done. You will work with people who will not give you anything during the process and then blame you when the deliverable is late. You will work with people who will review your work and send back nothing but comma corrections when what you needed was substantive feedback.
The pattern every consultant recognizes: You send a draft chapter to the client for review. You ask for feedback on content accuracy, narrative flow, and data completeness. The review comes back two weeks late. The only comments are: "Change this comma to a semicolon," "Can we use a different shade of blue for this heading?" and "Please spell out all abbreviations on first use."
Nothing about whether the content is accurate. Nothing about whether the strategy description reflects reality. Nothing about the data.
You cannot prevent this. But you can prepare for it. When you send chapters for review, include specific questions: "Is the emissions reduction target stated on page 12 still current?" "Does the description of your water management program on page 18 accurately reflect operations at all sites?" Specific questions get specific answers, or at least make it harder to respond with only formatting corrections.
Consistency Is Everything
If there is a single thread that runs through this entire course, it is consistency. Not as an abstract virtue, but as a concrete, measurable quality that separates professional reports from amateur ones.
Boundary consistency. Whatever boundary you define in the "About This Report" section, that boundary applies to every number in the report. You cannot report standalone emissions on page 20 and consolidated workforce data on page 45 without clearly flagging the difference. And if the BRSR uses the same boundary, the numbers must match.
Data consistency. The same metric cannot show different values in different sections of the report. If your total energy consumption appears in the environmental section, in a data table in the appendix, and in the GRI Content Index, all three must be identical.
Language consistency. If you write in third person ("The Company has implemented..."), stay in third person. If the client prefers first person ("We have implemented..."), stay in first person. Do not switch between the two within a chapter, and ideally not within the report.
Theme consistency. If you have established a theme for the report (a narrative thread about resilience, or innovation, or integration), that theme needs to appear in every major section. Not forced, not artificial, but present. If the theme disappears for 40 pages and then reappears in the conclusion, it was not really a theme.
Design consistency. Same fonts, same color scheme, same chart styles, same icon language throughout the document. The environmental section should not look like it was designed by a different agency than the governance section.
Before you submit the final report for design (or for the last round of review), run through these checks:
- Pick any three quantitative metrics. Search for them across the entire report. Do the numbers match everywhere they appear?
- Read the first paragraph of each chapter. Is the voice consistent (first vs. third person, active vs. passive)?
- Look at the report theme. Can you find it referenced or reflected in every major section?
- Check the boundary. Is it stated clearly at the beginning? Does every data point fall within that boundary?
- Open the BRSR (if applicable). Compare at least five key metrics between the two documents.
If any of these checks fails, fix it before the report moves forward. Inconsistency discovered by the reader is far more damaging than inconsistency caught internally.
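The first two checks above (identical metric values everywhere, and a consistent first- vs. third-person voice) are mechanical enough to automate before a manual read-through. Below is a minimal sketch in Python, assuming the report has been exported to plain text; the file contents, metric value, and word lists are hypothetical examples, not a fixed rule set.

```python
import re

def find_metric_occurrences(text, metric_value):
    """Return every line that contains the given metric value (e.g. '12,450').

    If the count differs from the number of places the metric should appear
    (chapter, appendix table, GRI Content Index), something is inconsistent.
    """
    return [line.strip() for line in text.splitlines() if metric_value in line]

def voice_counts(text):
    """Rough tally of first-person vs. third-person markers for the voice check.

    The marker lists are illustrative; tune them to the client's house style.
    """
    first = len(re.findall(r"\bWe\b|\bour\b", text))
    third = len(re.findall(r"\bThe Company\b|\bits\b", text))
    return first, third

# Hypothetical excerpt standing in for the exported report text.
report = """
Total energy consumption in FY25 was 12,450 GJ across all sites.
Appendix A: Energy consumption .... 12,450 GJ
GRI 302-1: Energy consumption within the organization: 12,450 GJ
The Company has implemented an energy management system at its plants.
"""

# Check 1: the same metric should read identically everywhere it appears.
hits = find_metric_occurrences(report, "12,450")
print(f"'12,450' appears on {len(hits)} lines")

# Check 2: a chapter that mixes voices will show non-trivial counts on both sides.
first, third = voice_counts(report)
print(f"first-person markers: {first}, third-person markers: {third}")
```

This does not replace reading the report; it only narrows where to look. A mismatch in the occurrence count, or a chapter scoring high on both voice tallies, tells you which pages to re-check by hand.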
The Final Word
ESG reporting is not glamorous. It is not fast. It is a long process of collecting data that does not want to be collected, writing narratives for people who cannot articulate what they want, managing timelines that nobody respects, and producing a document that (if you do everything right) earns you a polite "thank you" and a request to do it again next year.
But it matters. A well-written sustainability report does something that few other corporate documents can do: it forces a company to look at itself honestly (at its environmental impact, its treatment of people, its governance practices) and put that on the record. When done well, the report becomes a tool for accountability and a catalyst for improvement. When done poorly, it is just another PDF that nobody reads.
The difference between the two is not the framework you follow or the software you use. It is the care you bring to the work. The willingness to dig through old annual reports at 11 PM to find the right narrative. The discipline to check every number one more time. The patience to manage difficult stakeholders without losing your composure. The honesty to push back when something in the report is not accurate, even when the client would prefer you did not.
You now have the full picture of how ESG reporting works in practice: from the first discovery call to the final delivery. The rest is experience. Go build it.
Consistency, conciseness, and care. If you bring those three things to every engagement, you will produce reports that stand out. Not because they are flashy, but because they are reliable, readable, and real. That is a higher bar than most of what is out there. Meet it.
Key Takeaways
- Do not wait for the client to provide narratives: write the first draft yourself from publicly available sources, then let the client correct and refine it
- The reports that stand out balance visuals, narrative, and data on every spread, stay concise, and maintain a clear editorial voice
- AI-generated content is acceptable for first drafts but must be edited to be specific, substantive, and indistinguishable from human writing: generic filler destroys credibility
- People management, not technical skill, is what makes or breaks an engagement: getting data from reluctant departments, managing conflicting feedback, and navigating internal politics is the real job
- Consistency is the single thread that ties everything together: boundary consistency, data consistency, language consistency, theme consistency, and design consistency across the entire report
- When sending chapters for review, include specific questions about content accuracy rather than open-ended requests: specific questions get specific answers