Case Study

I turned 90 minutes of busywork into a 30‑second morning check.

rcs construction's BD team was burning 90 minutes a session on manual lead research. I built a system that has everything scraped, analyzed, and waiting before anyone gets to the office.


Act I / The Problem

Every morning, the BD team opened a dozen browser tabs and started scrolling.

90 min
Per session, just to separate real leads from noise
12+
News sites, municipal portals, and tender boards to check every morning
0
Prioritization. No filters. No analysis. Just raw, endless reading.

The business development team, the people who should be closing deals, were spending 90 minutes every session sifting through noise to find the one article that actually mattered. Whale activism, Pokémon events, and local fundraisers, all mixed in with real construction leads. No way to filter. No way to prioritize. Just reading everything and hoping you don't miss the $2M project buried on page three.

"What if the leads were already found, analyzed, and waiting before anyone sits down?"

The question I kept coming back to

Act II / The System

Built it in weeks.
Runs itself forever.

Five stages. Zero human intervention. Fresh leads every morning at 8:00 AM.

01 Collect: Web Scraper
02 Enrich: AI Analysis (when the AI is available)
03 Fallback: Backup Engine (if the AI fails)
04 Classify: Priority Tagger
05 Deliver: The Dashboard
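The five stages can be sketched as a short pipeline: try AI enrichment first, and fall back to a deterministic backup engine when it fails. This is a minimal illustration, not the production system; the function names, the `Lead` shape, and the keyword list are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Lead:
    title: str
    source: str
    priority: str = "low"

def collect(sources):
    # 01 Collect: in production this is the web scraper; stubbed here
    # as a dict mapping each source name to its scraped headlines.
    return [Lead(title=t, source=s) for s, titles in sources.items() for t in titles]

def enrich_ai(lead):
    # 02 Enrich: AI analysis of the raw lead. Can fail in the real
    # world (rate limits, outages); simulated as always failing here.
    raise ConnectionError("AI service unavailable")

def enrich_fallback(lead):
    # 03 Fallback: backup engine using simple keyword rules
    # (illustrative keywords, not the real ruleset).
    hot = ("tender", "contract", "rfp", "municipal")
    lead.priority = "high" if any(k in lead.title.lower() for k in hot) else "low"
    return lead

def classify(leads):
    # 04 Classify: high-priority leads float to the top.
    return sorted(leads, key=lambda l: l.priority != "high")

def deliver(leads):
    # 05 Deliver: render the dashboard (plain text stand-in).
    return "\n".join(f"[{l.priority.upper()}] {l.title} ({l.source})" for l in leads)

def run(sources):
    enriched = []
    for lead in collect(sources):
        try:
            enriched.append(enrich_ai(lead))        # AI OK
        except Exception:
            enriched.append(enrich_fallback(lead))  # if AI fails
    return deliver(classify(enriched))
```

The key design point is that every stage degrades rather than breaks: an AI outage downgrades enrichment quality, it never stops the 8:00 AM delivery.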

The Dashboard / Live

The pipeline, styled like a morning broadsheet. Leads scraped, enriched, tagged, and laid out like front-page stories before anyone gets to the office.

Act III / The Setback

ERROR 403 // Bot detection triggered

Then the primary source shut me out.

AllNovaScotia, the main data source, implemented bot detection and blocked automated access. Weeks of work, broken overnight.

The easy move? Say "the site changed, nothing I can do."

I didn't take the easy move.

The Pivot

The value was never in scraping one site. It was in saving the team hours by aggregating scattered information. So I rebuilt the architecture to be source-agnostic. The enrichment pipeline, the dashboard, the priority tagging, all of it works regardless of where the raw data comes from.
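One common way to make a pipeline source-agnostic is to hide every source behind a single interface, so the enrichment, tagging, and dashboard code never know where the raw data came from. The sketch below assumes this pattern; the class names, methods, and stubbed data are illustrative, not taken from the actual rebuild.

```python
from abc import ABC, abstractmethod

class LeadSource(ABC):
    """Common interface: downstream stages only ever see raw lead dicts."""
    @abstractmethod
    def fetch(self) -> list[dict]:
        ...

class RSSFeedSource(LeadSource):
    def __init__(self, url: str):
        self.url = url
    def fetch(self) -> list[dict]:
        # In production: download and parse the feed; stubbed here.
        return [{"title": "Municipal RFP posted", "origin": self.url}]

class TenderBoardSource(LeadSource):
    def __init__(self, board: str):
        self.board = board
    def fetch(self) -> list[dict]:
        # In production: scrape the tender board; stubbed here.
        return [{"title": f"New tender on {self.board}", "origin": self.board}]

def gather(sources: list[LeadSource]) -> list[dict]:
    # Enrichment and the dashboard consume this flat list,
    # regardless of which sources produced it.
    leads = []
    for src in sources:
        try:
            leads.extend(src.fetch())
        except Exception:
            continue  # a blocked source degrades the run, never breaks it
    return leads
```

With this shape, losing AllNovaScotia means deleting one adapter class and adding others, while everything downstream stays untouched.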

Act IV / The Results

Before and after.
Same team, different tool.

Manual research time: 90 min per session → 0
Sources to check: 12+ → 1 single dashboard
Prioritization: none → automatic AI priority detection
Daily routine: manual → 100% team adoption
The team doesn't open a dozen tabs anymore. They open one page that looks like a morning newspaper. By the time they sit down, everything's already there.
Daily workflow, rcs construction

What's Next

The lead scraper was the first tool I built here. It cut hours of manual work without replacing anyone; it just lets the team focus on what they're actually good at.

The same approach works for RFP analysis, project estimation, bid tracking, and a dozen other workflows where the bottleneck is information, not people.