LeadHaste

I Built a LinkedIn Ad Scraper in a Weekend (No Code Required)

Dimitar Petkov · Mar 17, 2026 · 6 min read

I built a LinkedIn Ad Library scraper from scratch using Claude Code. Zero coding background. It imports 999 Capterra software keywords, runs a two-phase Playwright scraper, rotates residential proxies automatically, and schedules itself daily.

The entire pipeline would cost thousands per month as a SaaS product. I built it in a weekend.

Why LinkedIn Ad Data Is Pure Gold

Companies running LinkedIn ads are spending money. That's not just activity; that's a buying signal.

When a B2B company invests in LinkedIn advertising, they're telling you three things:

  1. They have budget allocated for growth
  2. They're actively looking for customers
  3. They're willing to pay premium prices for acquisition

This is intent data in its purest form. Not someone who downloaded a whitepaper or visited a comparison page. Not a LinkedIn profile view or a website visit. Actual marketing spend.

The problem? Most agencies and sales teams have no systematic way to capture this signal. They're stuck buying the same generic intent data as their competitors, hoping to find buyers in a sea of tire-kickers.

The Old Way: Pay for What Everyone Else Has

Traditional intent data providers charge $2,000-$5,000 per month for signals that every other agency can access. You get:

  • Website visitor data (anonymous)
  • Content consumption tracking (delayed)
  • Technographic data (static)
  • Generic buying signals (shared with 1,000 other customers)

The data is useful, but it's not exclusive. When you're prospecting the same companies as 50 other agencies using the same data source, you lose your edge.

Custom data collection used to require either:

  • Hiring a developer ($80-150/hour)
  • Subscribing to expensive niche data providers ($3,000+/month)
  • Building an in-house data team (six-figure commitment)

For most agencies, these options were out of reach. So they settled for generic signals and competed on execution rather than information advantage.

How AI Changed the Game

AI-powered development tools like Claude Code, Cursor, and GitHub Copilot have collapsed the barrier between "I want this" and "I have this."

Here's what my LinkedIn Ad scraper does:

Phase 1: Advertiser Discovery

  • Imports 999 software keywords from Capterra
  • Scrapes LinkedIn's Ad Library for each keyword
  • Collects all companies running ads in those categories
  • Stores advertiser names and LinkedIn URLs

Phase 2: Company Enrichment

  • Takes each discovered advertiser
  • Enriches with employee count, industry, LinkedIn ID
  • Validates and deduplicates across historical runs
  • Exports to a clean CSV for outreach
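The two phases above can be sketched as a pair of small functions. This is a minimal illustration, not the actual pipeline: the `Advertiser` dataclass, `search_fn`, and `enrich_fn` are stand-ins for whatever your AI tool generates to drive the browser.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Advertiser:
    name: str
    linkedin_url: str

def discover(keywords, search_fn):
    """Phase 1: collect advertisers for each keyword, deduped by LinkedIn URL."""
    seen = {}
    for kw in keywords:
        for adv in search_fn(kw):
            seen.setdefault(adv.linkedin_url, adv)  # first sighting wins
    return list(seen.values())

def enrich(advertisers, enrich_fn, known_urls):
    """Phase 2: enrich only advertisers not already seen in historical runs."""
    return [enrich_fn(a) for a in advertisers
            if a.linkedin_url not in known_urls]
```

Keeping discovery and enrichment separate matters: discovery is cheap and wide, enrichment is slow and per-company, so you only want to pay the enrichment cost for genuinely new advertisers.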

Infrastructure Features

  • Rotates residential proxies to avoid rate limits
  • Schedules daily runs automatically
  • Handles errors and retries gracefully
  • Logs everything for debugging
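The proxy-rotation and retry behavior is the part most worth understanding, because it's what keeps a daily scraper alive. A minimal sketch, assuming a hypothetical proxy pool and a `fetch` callable supplied by the scraper:

```python
import itertools
import random
import time

# Hypothetical residential proxy endpoints; a real pool comes from your provider.
PROXIES = [
    "http://user:pass@res-proxy-1:8000",
    "http://user:pass@res-proxy-2:8000",
    "http://user:pass@res-proxy-3:8000",
]
_proxy_cycle = itertools.cycle(PROXIES)

def fetch_with_retry(fetch, url, retries=3, base_delay=2.0):
    """Rotate to a fresh proxy on every attempt; back off exponentially with jitter."""
    last_exc = None
    for attempt in range(retries):
        proxy = next(_proxy_cycle)
        try:
            return fetch(url, proxy=proxy)
        except Exception as exc:  # rate limit, timeout, block page, ...
            last_exc = exc
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, base_delay))
    raise RuntimeError(f"gave up on {url} after {retries} attempts") from last_exc
```

Rotating on every attempt (not just on failure) spreads requests across IPs, and the jittered backoff avoids hammering the site in lockstep when several keywords fail at once.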

I didn't write this code myself. I described what I wanted to Claude Code, refined the prompts based on errors, and let it build the entire pipeline. The whole process took a weekend, mostly spent testing and tweaking proxy settings.

What This Means for Agencies

The agencies that figure out custom data collection will have an unfair advantage.

Instead of prospecting companies that "visited a pricing page," you can target companies that are:

  • Running ads for competitor products
  • Advertising job openings for roles you serve
  • Sponsoring industry events
  • Launching new product lines
  • Expanding into new markets

Each of these signals can be scraped, tracked, and turned into a prospecting list. And because you're the only one with this exact data set, you're not competing with 50 other agencies for the same prospects.

The agencies that don't adapt? They'll keep buying the same lists as their competitors and wondering why their cold emails get ignored.

Building Your Own Intent Data Pipeline

You don't need to be technical to do this. You need to be specific about what you want.

Here's the framework:

1. Identify Your Buying Signal

What action indicates a company is ready to buy? For me, it was "running LinkedIn ads in software categories." For you, it might be:

  • Posting job openings for specific roles
  • Publishing case studies in your niche
  • Attending industry conferences
  • Raising funding rounds
  • Launching new products

2. Find Where That Signal Lives

LinkedIn Ad Library, job boards, press release sites, event directories, product launch platforms. Most of this data is public; you just need to know where to look.

3. Describe the Pipeline

Tell Claude Code (or your AI tool of choice):

"Build a scraper that checks [data source] daily for [specific signal], enriches each result with [company data points], deduplicates against previous runs, and exports to CSV."
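The deduplicate-and-export step that prompt asks for is worth seeing concretely, since it's what turns raw scrapes into a clean daily list. A minimal sketch (file names and the field list are assumptions, not the actual pipeline's schema):

```python
import csv
from pathlib import Path

FIELDS = ["name", "linkedin_url", "industry", "employee_count"]

def load_seen(history_path):
    """URLs exported on previous runs; empty set on the first run."""
    path = Path(history_path)
    if not path.exists():
        return set()
    with path.open(newline="", encoding="utf-8") as f:
        return {row["linkedin_url"] for row in csv.DictReader(f)}

def export_new(rows, history_path, out_path):
    """Write only never-before-seen companies, then append them to history."""
    seen = load_seen(history_path)
    new_rows = [r for r in rows if r["linkedin_url"] not in seen]
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(new_rows)
    header_needed = not Path(history_path).exists()
    with open(history_path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if header_needed:
            writer.writeheader()
        writer.writerows(new_rows)
    return new_rows
```

The history file is the memory that makes daily runs useful: today's export only contains companies you haven't already reached out to.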

4. Refine Based on Errors

The first version won't be perfect. You'll hit rate limits, miss edge cases, or find data quality issues. Copy the error messages back to the AI and ask it to fix them. Repeat until it works.

5. Schedule and Monitor

Set it to run daily (or weekly). Check the output regularly. Adjust your targeting criteria as you learn what converts.
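On Linux or macOS, the simplest way to get a daily run is a single cron entry. The paths below are placeholders for wherever your script and logs live:

```shell
# crontab entry: run the scraper every day at 06:00,
# appending stdout and stderr to a log file for debugging
0 6 * * * /usr/bin/python3 /opt/adscraper/run.py >> /var/log/adscraper.log 2>&1
```

Logging every run to a file means that when the output looks wrong, you can check whether the scraper failed, got blocked, or simply found nothing new that day.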

The Competitive Advantage

Custom intent data isn't just about finding more leads. It's about finding different leads.

When you prospect companies based on signals no one else is tracking, you:

  • Reach buyers before your competitors do
  • Lead with insights they haven't heard before
  • Avoid saturated prospect lists
  • Command higher prices because your targeting is unique

This is the same advantage that private equity firms have when they use proprietary deal flow sources. Or that hedge funds have when they track alternative data sets. Information asymmetry creates pricing power.

For agencies, that pricing power shows up as higher close rates, shorter sales cycles, and the ability to charge premium rates because your prospecting is demonstrably better.

What Happens Next

The barrier to custom intent data just dropped to zero. If you can describe what you want, you can build it.

No more waiting for vendors to add features. No more paying for data you don't need. No more settling for generic signals everyone else has.

The question isn't whether AI will change how agencies do prospecting. It's whether you'll be one of the agencies that adapts early or one that realizes too late.

Start small. Pick one buying signal you wish you could track. Spend a weekend building a scraper. See what you find.

The agencies that figure this out in 2026 will have a two-year head start on everyone else. The ones that wait will spend those two years buying the same lists as their competitors and wondering why their outbound stopped working.

Ready to build outbound that compounds?

We'll build the entire system for your business. $7K+ in services, free — you only cover the infrastructure.

Get Your First Campaign Build →