Adligator Team

How to Use Meta Ad Library API for Automated Competitor Monitoring at Scale

The Meta Ad Library API offers programmatic access to the same data available through the web-based Ad Library. For technical teams, this opens the door to automated competitor monitoring, custom dashboards, and data pipelines that manual browsing cannot match.

But there is a significant gap between what the API promises and what it delivers in practice. Rate limits, data restrictions, missing fields, and the engineering overhead of building and maintaining a monitoring system make the DIY approach more complex than most teams anticipate.

This guide covers how to set up the API, what you can build with it, where it falls short, and when an ad spy tool like Adligator is the more practical choice.

What Is the Meta Ad Library API

The Meta Ad Library API is a public API that provides access to ads data from Meta's transparency initiative. It was originally created for political ad transparency but has been extended to cover all ad categories.

What the API provides:

  • Active ad data for any advertiser on Meta platforms
  • Ad creative content (text, links)
  • Start and end dates
  • Platform distribution (Facebook, Instagram, Messenger, Audience Network)
  • Page name and ID
  • For political/social issue ads: spend ranges, impressions, and demographic data

What the API does NOT provide:

  • Engagement metrics (likes, comments, shares)
  • Ad performance data (CTR, conversions, ROAS)
  • Precise spend data for commercial ads
  • Historical data for deactivated commercial ads
  • Creative files (images/videos — only links)
  • Landing page URLs
  • Audience targeting parameters

How to Get API Access

Step 1: Create a Meta Developer account. Go to developers.facebook.com and create an account if you do not have one.

Step 2: Create a new app. In the Meta Developer Dashboard, create a new app with type "Business." Name it something descriptive like "Competitor Monitoring."

Step 3: Add the Ad Library API product. In your app settings, add the Marketing API product, which includes Ad Library API access.

Step 4: Generate an access token. Use the Graph API Explorer or your app's access token to authenticate API calls. Note: long-lived tokens require app review for production use.

Step 5: Read the documentation. Review the Ad Library API docs at developers.facebook.com/docs/marketing-api/reference/ad-library/. The API follows Graph API conventions.

Key Endpoints and Parameters

The primary endpoint is:

GET /ads_archive

Key parameters:

  • search_terms — keyword search
  • ad_reached_countries — filter by country (ISO codes)
  • search_page_ids — filter by specific Facebook page IDs
  • ad_type — "ALL" or "POLITICAL_AND_ISSUE_ADS"
  • ad_active_status — "ACTIVE", "INACTIVE", or "ALL"
  • fields — specify which data fields to return
  • limit — results per page (max 1000)

Available fields include:

  • id, ad_creation_time, ad_delivery_start_time
  • ad_delivery_stop_time (if stopped)
  • ad_creative_bodies (text content)
  • ad_creative_link_captions, ad_creative_link_titles
  • page_id, page_name
  • publisher_platforms (facebook, instagram, etc.)
  • bylines (for political ads)
  • For political ads: spend, impressions, demographic_distribution

Example API call:

curl -G \
  "https://graph.facebook.com/v19.0/ads_archive" \
  -d "search_terms=skincare" \
  -d "ad_reached_countries=['US']" \
  -d "ad_active_status=ACTIVE" \
  -d "fields=id,ad_creation_time,ad_creative_bodies,page_name,publisher_platforms" \
  -d "limit=100" \
  -d "access_token=YOUR_TOKEN"

Building a Basic Monitoring Script

Here is a simplified monitoring workflow using the API:

1. Daily collection script: Run a scheduled script that queries the API for your competitors and niche keywords. Store results in a database (PostgreSQL, SQLite, or even Google Sheets for small scale).

2. Change detection: Compare today's results against yesterday's. Flag:

  • New ads (ad IDs not seen before)
  • Stopped ads (previously active IDs now inactive)
  • New advertisers in your keyword space

3. Alerting: Send notifications (email, Slack webhook) when changes are detected.
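The alerting step can be as small as a POST to a Slack incoming webhook. A minimal sketch, assuming you have created a webhook URL in your own Slack workspace (the URL and helper names below are illustrative, not part of any API):

```python
import json
import urllib.request

def build_alert_text(new_ads, stopped_ads):
    """Summarise detected changes into a one-line alert message."""
    return f"Ad changes detected: {len(new_ads)} new, {len(stopped_ads)} stopped"

def send_slack_alert(webhook_url, text):
    """POST the message to a Slack incoming webhook."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status == 200
```

Email works the same way via `smtplib`; the point is that delivery is the easy part, and change detection is where the real logic lives.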

Basic Python example:

import json
import requests

API_URL = "https://graph.facebook.com/v19.0/ads_archive"

def fetch_ads(keyword, country, token):
    """Fetch all active ads for a keyword, following pagination."""
    params = {
        "search_terms": keyword,
        "ad_reached_countries": json.dumps([country]),
        "ad_active_status": "ACTIVE",
        "fields": "id,ad_creation_time,ad_creative_bodies,page_name",
        "limit": 100,
        "access_token": token,
    }
    ads = []
    url = API_URL
    while url:
        response = requests.get(url, params=params)
        response.raise_for_status()  # surface HTTP errors instead of failing silently
        payload = response.json()
        ads.extend(payload.get("data", []))
        # The Graph API returns a full next-page URL under paging.next (absent on the last page)
        url = payload.get("paging", {}).get("next")
        params = None  # the next-page URL already carries the full query string
    return ads

# Fetch, compare with stored data, alert on changes

This basic setup gives you automated monitoring for a handful of competitors and keywords.
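The change-detection step reduces to a set comparison between today's and yesterday's snapshots of active ad IDs. A minimal sketch (the function name and return shape are our own, not part of the API):

```python
def diff_snapshots(previous_ids, current_ids):
    """Flag new and stopped ads by comparing two snapshots of active ad IDs."""
    previous, current = set(previous_ids), set(current_ids)
    return {
        "new": sorted(current - previous),      # ad IDs not seen before
        "stopped": sorted(previous - current),  # previously active, now gone
    }
```

In practice you would persist each day's snapshot (SQLite is enough at small scale) and feed the previous run's IDs into this comparison.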

Limitations of the API Approach

Building on the API sounds straightforward until you hit these walls:

Rate limits: The API enforces strict rate limits. High-volume queries (many competitors × many keywords × many countries) quickly exhaust your quota. Scaling to monitor 50+ competitors across multiple GEOs requires careful throttling and may need multiple apps or approved Business Manager accounts.
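Careful throttling usually means retrying with exponential backoff when the API signals throttling. The sketch below is generic: the exact throttling signal varies (Graph API errors commonly carry codes such as 4 or 613 in the response JSON — verify against current docs), so the caller decides when to raise the rate-limit exception:

```python
import random
import time

class RateLimitError(Exception):
    """Raised by the caller when the API response indicates throttling."""

def call_with_backoff(fn, max_retries=5, base_delay=2.0):
    """Call fn(); on RateLimitError, sleep with exponential backoff + jitter and retry."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            # 2s, 4s, 8s, ... plus jitter, so parallel workers do not retry in lockstep
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    raise RuntimeError("rate limit retries exhausted")
```

Wrapping each API call in `call_with_backoff` smooths over transient throttling, but it cannot raise your overall quota — at 50+ competitors you still need to spread queries across the day.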

No creative files: The API returns text content and links but not actual image or video files. To analyze visual creatives, you need to separately download them — adding complexity and storage requirements.

No longevity data: There is no "days active" field. You can calculate it from ad_creation_time to current date, but only for currently active ads. Once an ad is deactivated, commercial ad data becomes unavailable.
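Computing days active yourself is a one-liner once the timestamp is parsed. A minimal sketch, assuming the ISO-8601 format with a `+0000` offset that the API typically returns (worth verifying against a live response):

```python
from datetime import datetime, timezone

def days_active(ad_creation_time, now=None):
    """Approximate days active from the ad_creation_time field of an active ad."""
    # Normalise '+0000' to '+00:00' so fromisoformat accepts it on older Pythons
    start = datetime.fromisoformat(ad_creation_time.replace("+0000", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return (now - start).days
```

Remember the caveat above: this only works while the ad is active, because deactivated commercial ads drop out of the API entirely.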

No engagement metrics: Likes, comments, shares — none of these are available through the API. You cannot sort or filter by performance signals.

No landing page data: The API does not consistently expose destination URLs. You get link captions and titles but not always the actual click-through URL.

No duplicate detection: You cannot determine how many ads use the same creative without building your own deduplication system.
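A rough text-level deduplication can hash the normalised ad copy, so identical or near-identical text collapses to one key. This is a sketch of our own making — it catches text reuse only; visual duplicates would need image hashing, which the API's lack of creative files makes harder:

```python
import hashlib

def creative_fingerprint(ad):
    """Hash normalised ad text so ads sharing the same copy collapse to one key."""
    bodies = ad.get("ad_creative_bodies") or []
    normalised = " ".join(" ".join(b.split()).lower() for b in bodies)
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def group_duplicates(ads):
    """Return fingerprint -> list of ad IDs, keeping only groups with 2+ ads."""
    groups = {}
    for ad in ads:
        groups.setdefault(creative_fingerprint(ad), []).append(ad["id"])
    return {k: v for k, v in groups.items() if len(v) > 1}
```

A large duplicate group is itself a signal: advertisers rarely scale a creative across many ads unless it is working.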

Engineering overhead: Building and maintaining a reliable monitoring system requires: database setup, scheduled jobs, error handling, rate limit management, change detection logic, alert delivery, and a frontend for reviewing results. For a marketing team, this is weeks of engineering time.

API instability: Meta occasionally changes API behavior, deprecates fields, or adjusts rate limits. Maintaining an API-based system requires ongoing engineering attention.

When DIY API Falls Short

The API approach makes sense when:

  • You have dedicated engineering resources
  • You need custom data pipelines for specific analytics
  • You are building a product that integrates ad library data
  • Your monitoring needs are narrow (1-2 competitors, 1 keyword)

The API approach breaks down when:

  • You need to monitor 10+ competitors across multiple GEOs
  • Visual creative analysis is important (images, videos)
  • You want longevity-based filtering (days active)
  • You need live monitoring with instant alerts
  • Your team lacks engineering resources to build and maintain the system
  • You need domain, IP, or OS-based filtering

How Ad Spy Tools Solve API Limitations

Ad spy tools like Adligator build on top of Meta's data layer but add everything the raw API lacks:

What Adligator provides that the API cannot:

  • Visual creative database — browse actual ad images and videos, not just text
  • Days-active filter — the most powerful profitability signal, pre-calculated
  • Domain zone filtering — identify affiliate/arbitrage patterns by TLD
  • IP address filtering — find all ads from the same infrastructure
  • OS targeting — see iOS vs. Android targeting
  • Duplicate detection — know which creatives are being scaled
  • Live trackers — saved searches with automatic new-match detection
  • Collections — save and organize winning creatives
  • No engineering required — ready to use in minutes, not weeks

The cost comparison: Building a robust API-based monitoring system costs engineering time (40-80 hours for MVP, ongoing maintenance) plus infrastructure (hosting, database, alerting service). At a $100+/hour engineering rate, the build alone costs $4,000-$8,000.

Adligator Pro costs $32/month. Even over a year ($384), it is a fraction of the engineering cost — and comes with features you could not replicate from the API alone.

Skip the API hassle: try Adligator for instant competitor monitoring

Adligator vs DIY API Monitoring

DIY API monitoring:

  • ✅ Full data ownership and customization
  • ✅ No recurring subscription cost (beyond hosting)
  • ❌ Weeks of engineering to build
  • ❌ Ongoing maintenance for API changes
  • ❌ No visual creatives
  • ❌ No longevity filtering
  • ❌ No domain/IP/OS filters
  • ❌ No live monitoring alerts
  • ❌ Rate limit management required

Adligator:

  • ✅ Ready in minutes, no engineering
  • ✅ Visual creative browsing
  • ✅ Days-active, domain zone, IP, OS filters
  • ✅ Live trackers (7-14)
  • ✅ 234 countries
  • ✅ $32/month Pro
  • ❌ No raw data export (for custom analytics pipelines)
  • ❌ Meta-only (no TikTok, YouTube, native)

For 95% of marketing teams, Adligator provides everything needed for competitor monitoring without the API engineering overhead. The 5% who need custom data pipelines or integration with internal analytics may benefit from the API — but should still consider Adligator for daily monitoring alongside their custom build.

FAQ

Is the Meta Ad Library API free?

Yes, the API is free to use. You need a Meta Developer account and an approved app. However, there are rate limits, restricted data fields, and the API only returns currently active ads for most ad categories.

Can I get competitor ad spend from the API?

Only for ads about social issues, elections, or politics. Regular commercial ad spend is not available through the API. You can infer relative spend from ad volume and longevity.

Is Adligator a replacement for the Meta Ad Library API?

For most marketing teams, yes. Adligator provides what the raw API cannot: creative analytics, longevity tracking, domain zone filtering, live monitoring trackers, and a visual interface — without engineering overhead.

What programming language should I use for the API?

Python is the most common choice due to its requests library and data processing ecosystem (pandas, SQLAlchemy). JavaScript/Node.js is a good alternative. Any language that can make HTTP requests and parse JSON works.

Conclusion

The Meta Ad Library API is a powerful free resource for programmatic access to advertising data. For technical teams building custom analytics or products, it provides the foundation for automated monitoring.

But for marketing teams who need actionable competitive intelligence — visual creatives, longevity filtering, domain analysis, live monitoring — the raw API is just the starting point. Building a complete monitoring system on top of it requires significant engineering investment that most teams cannot justify when purpose-built tools exist.

Adligator delivers what the API cannot: visual creative browsing, days-active filtering, domain/IP/OS analysis, and live trackers — ready to use in minutes at $32/month.

Ready to apply this workflow? Skip the API hassle — try Adligator for instant competitor monitoring

See what Adligator monitors automatically

© 2026 Adligator Ltd. All rights reserved.
Adligator Ltd - Registered in England and Wales, 16889495. 3rd Floor, 86-90 Paul Street, London, England, United Kingdom, EC2A 4NE