How We Eliminated 15 Hours a Week of Manual Reporting

Three hours every morning, five days a week

The workflow was this: every morning, a team member would log into four different platforms, download reports in various formats (CSV, Excel, PDF), open each one, extract the relevant columns, paste them into a master spreadsheet, add timestamps, and email the consolidated report to six stakeholders. By 11am, the reporting was done. By then, so was the morning.

This wasn’t a small operation running on shoestring tools. This was an established business with real revenue, doing serious work — and burning 15+ hours a week on copy-paste. The data was important. The process of getting it was not.

“We knew it was wasteful. But every time we looked at automating it, the quotes were six figures and six months. We needed something that worked next week, not next quarter.”

— Operations director

Understanding the actual workflow before automating it

The first thing we did was sit with the person doing the work and watch. Not look at a requirements document — actually watch the process. This revealed two things the brief hadn’t mentioned:

  • Two of the four source platforms had APIs. Two didn’t — they required browser-based logins and manual downloads. Any automation solution needed to handle both.
  • The “master spreadsheet” wasn’t just a data dump — it had conditional formatting, calculated columns, and specific column ordering that stakeholders relied on. The output format mattered as much as the data.

What we built

  1. API integrations for the two platforms that supported them. Straightforward authenticated requests on a schedule, with error handling and retry logic. Data normalised into a common schema on ingestion.
  2. Headless browser automation for the platforms without APIs. We used a Go-based scraper that logs in, navigates to the report section, downloads the file, and extracts the data. This is inherently fragile — UI changes can break it — so we built monitoring that alerts us when a scrape fails, along with a manual fallback path.
  3. Data processing pipeline. All four data sources feed into a processing stage that normalises column names, applies business rules (currency conversions, date standardisation, duplicate detection), timestamps everything, and produces the consolidated output in the exact format the team was used to.
  4. Automated distribution. The finished report is emailed to stakeholders at 8am, before anyone starts their day. The email includes a summary line with key changes from the previous day, so recipients know at a glance whether they need to dig into the detail.
  • 15+ hours per week eliminated from manual reporting
  • 8:00 AM: report lands in inboxes, before the team arrives
  • 4 data sources consolidated into one automated pipeline

The parts people don’t talk about

Building the happy path took two weeks. Making it reliable took another three. The difference was error handling:

  • What happens when a source platform is down at 6am? (Answer: retry three times, then send the report with a note about which source is missing.)
  • What happens when the data format changes? (Answer: validation checks flag unexpected schemas and alert us before the pipeline produces bad output.)
  • What happens when a stakeholder is added or removed? (Answer: distribution list managed in a config file, not hard-coded.)
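The second check above, catching a changed data format before it produces bad output, amounts to comparing the columns in a downloaded report against an expected set and failing loudly. A minimal sketch, with hypothetical column names:

```go
package main

import (
	"fmt"
	"sort"
)

// validateSchema compares the columns present in a report against the
// expected set and returns an error (so an alert can fire) rather than
// letting bad data flow into the consolidated output.
func validateSchema(got, expected []string) error {
	seen := make(map[string]bool, len(got))
	for _, col := range got {
		seen[col] = true
	}
	want := make(map[string]bool, len(expected))
	for _, col := range expected {
		want[col] = true
	}
	var missing, unexpected []string
	for _, col := range expected {
		if !seen[col] {
			missing = append(missing, col)
		}
	}
	for _, col := range got {
		if !want[col] {
			unexpected = append(unexpected, col)
		}
	}
	if len(missing) == 0 && len(unexpected) == 0 {
		return nil
	}
	sort.Strings(missing)
	sort.Strings(unexpected)
	return fmt.Errorf("schema changed: missing %v, unexpected %v", missing, unexpected)
}

func main() {
	expected := []string{"date", "amount", "currency"}
	// A source quietly renamed "amount" to "gross_amount".
	err := validateSchema([]string{"date", "gross_amount", "currency"}, expected)
	if err != nil {
		fmt.Println(err) // this is the point where the alert would fire
	}
}
```

The important design choice is that an unexpected schema stops the pipeline before output, rather than producing a report that looks right but isn't.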

Automation that works 95% of the time and fails silently the other 5% is worse than no automation at all — because people stop checking. Every failure mode needs to be visible.

“The report used to be the first thing on someone’s to-do list every day. Now it’s just there when they arrive. That person spends those three hours on analysis instead of data entry. The quality of our decisions improved.”

— Operations director, post-launch

When to automate reporting (and when not to)

Automate when: the same person does the same data extraction more than three times a week, the sources have APIs or stable web interfaces, and the output format is well-defined.

Don’t automate when: the report changes structure frequently, requires human judgement to compile (not just human effort), or the data sources are genuinely unreliable. In those cases, invest in better tooling for the human rather than trying to remove them.

The goal of automation isn’t to replace the person. It’s to give them back three hours a day to do work that actually requires their brain.