LinkedIn engager scraper for outbound lead capture
Turn the people quietly liking and commenting on LinkedIn posts about your space into a deduped pipeline of warm leads — refreshed every day, no manual scrolling.
The motion
Every day, this agent scans LinkedIn for posts in your topic space, pulls every reactor and commenter on the 3–5 most engaged posts, and adds their profile URLs to a deduped `leads` table in your workspace database. Engagement is the strongest free intent signal LinkedIn gives you — most teams just never harvest it. This blueprint stands up a clean lead table on day one and puts the harvest on autopilot, so your outbound stack starts with a fresh stream of pre-warmed prospects.
Every reactor and commenter on the 3–5 most engaged posts in your topic space, upserted daily into a clean `leads` table the blueprint stands up on day one.
These prospects raised their hand on a post about your space. Your opener writes itself — reference the post they engaged with.
Stop sending SDRs to manually click through reactor lists. The agent rotates search terms, caps API spend, and ships a Slack summary so you see exactly what landed each day.
LinkedIn search, reactions, and comments all go through the agntdata unified API — no LinkedIn scraping infra, no proxies, no separate billing relationship.
Outbound teams already know LinkedIn engagement is the best free intent signal — they just never act on it because pulling reactors and commenters by hand is brutal. This blueprint moves that work to an agent so every day the highest-intent profiles in your space land directly in your leads table, deduped and ready to work.
Build this with agnt_
Skip the copy-paste. We'll spin up a builder session prepopulated with this blueprint's spec — providers, schedule, database schema, and the questions the agent should ask you to personalize it for your product.
Sign up free · no credit card
Or copy a prompt into another platform
Prefer to build with OpenClaw, Hermes, or Claude Code? Drop this prompt into your agent of choice — it seeds the goal, the agntdata endpoints to use, and a step-by-step plan.
You are helping me build a LinkedIn engager-scraping agent that runs on a daily cron. It searches LinkedIn for posts where people are talking about (or engaging with) topics adjacent to my product, pulls every reactor and commenter on those posts, and upserts their profile URLs into my `leads` table so my outbound team can work them.
REFERENCE DOCS (read these before writing code)
- Full agntdata API documentation: https://agnt.mintlify.app/apis/overview
- LinkedIn endpoints we'll use (all behind one agntdata key):
  - `Search_Posts` — keyword post search
  - `Search_Post_by_Hashtag` — hashtag post search
  - `Get_Post_Reactions` — reactors on a post
  - `Get_Company_Post_Comments` — commenters on a company-page post
  - `Get_Profile_Post_Comment` — commenters on a personal-profile post
ABOUT MY PRODUCT
- Product name: <YOUR PRODUCT>
- One-line description: <WHAT IT DOES>
- ICP: <WHO BUYS IT — role + company stage>
- Topics my buyers are likely engaging with on LinkedIn: <3–6 BULLETS>
- Hashtags worth watching: <3–6 HASHTAGS>
WHAT TO BUILD
- A scheduled agent on agntdata that runs once a day (rotate to every 4–6h once you trust it).
- The agent picks its own search terms each run from the topic list above — diversify across runs so you don't keep hitting the same posts.
- Per run: search keyword + hashtag, pick 3–5 fresh posts, fetch reactors + commenters, dedupe, upsert profile URLs into `leads`.
- All orchestration that touches the DB lives in **one TypeScript skill** (`scrape_linkedin_post_engagers`). The agent only chooses search terms and hands a JSON list of posts to the skill — keeps the chat-context small and the writes deterministic.
DATABASE (create both tables — fresh workspace, nothing to migrate)
- `leads` is the destination for upserts. Minimum schema for this skill: `id` (uuid PK, default gen_random_uuid()), `linkedin_profile_url` (text), `first_name` (text), `last_name` (text), `linkedin_source_post_url` (text), `created_at` / `updated_at` (timestamptz). Add a **partial unique index** on `linkedin_profile_url` where it is not null — that's what makes upserts dedup correctly without blocking rows that have other identifiers.
- `linkedin_scraped_posts` is a dedup/audit table. Columns: `post_url` (PK), `post_urn`, `keyword_used`, `author_name`, `post_snippet`, `reactor_count`, `commenter_count`, `leads_added`, `scraped_at`.
DELIVERY
- After each run, the skill posts a summary to Slack (`lead-scraping` channel by default) with: posts processed, total reactors, total commenters, new leads added, and per-post breakdown.
GUARDRAILS
- Don't call LinkedIn directly — always go through the agntdata endpoints so the unified billing and auth handle it.
- Cap at 5 posts per run. The skill does this; don't override.
- Skip posts that already exist in `linkedin_scraped_posts` — even if they "look fresh."
- Don't store full post bodies. Truncate `post_snippet` to 200 chars.
When you're ready, start by asking me the ABOUT MY PRODUCT questions.
Paste into OpenClaw to scaffold this agent. Tweak the inputs and goal at the top of the prompt.
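For reference, the JSON post list the prompt's WHAT TO BUILD section describes could look like the sketch below. This is hypothetical: the field names are illustrative, and the builder session defines the skill's real contract.

```typescript
// Hypothetical input contract for the scrape_linkedin_post_engagers skill.
// Field names are assumptions; the builder session defines the real shape.
interface CandidatePost {
  postUrl: string;        // canonical LinkedIn post URL (dedup key)
  postUrn: string;        // URN handed to the reactions/comments endpoints
  isCompanyPost: boolean; // routes to the company vs. profile comments endpoint
  keywordUsed: string;    // search term that surfaced this post, for the audit table
}

// The agent's only job per run: pick terms, search, and hand over 3-5 posts.
const posts: CandidatePost[] = [
  {
    postUrl: "https://www.linkedin.com/posts/example-post",
    postUrn: "urn:li:activity:7200000000000000000",
    isCompanyPost: false,
    keywordUsed: "ai agents for gtm",
  },
];
```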
How to build it
7 steps. Each one links to the underlying agntdata endpoints — open them in a new tab to inspect parameters and pricing as you build.
Sign up at app.agntdata.dev/dashboard. One key gives you LinkedIn search, reactions, and comments — plus Reddit, X, TikTok, Instagram, Facebook, YouTube, and Hunter on the same credential. Credit-based pricing, no monthly minimum.
On the agntdata dashboard, install the Slack connector for your workspace and pick the channel that should receive the daily summary (`lead-scraping` is the convention, but anything works). The connector handles the bot token and `chat:write` scope for you.
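If you ever wire the message yourself instead of using the connector, the daily summary might look like this. A minimal sketch using the official `@slack/web-api` client; the channel name and summary fields follow the blueprint's conventions, everything else is an assumption.

```typescript
// Sketch of the daily summary post, assuming the official @slack/web-api client.
import { WebClient } from "@slack/web-api";

const slack = new WebClient(process.env.SLACK_BOT_TOKEN);

async function postRunSummary(summary: {
  postsProcessed: number;
  totalReactors: number;
  totalCommenters: number;
  newLeads: number;
  perPost: string[]; // one pre-formatted line per post
}) {
  await slack.chat.postMessage({
    channel: "#lead-scraping", // the blueprint's default channel
    text: [
      `LinkedIn engager scrape: ${summary.postsProcessed} posts processed`,
      `Reactors: ${summary.totalReactors} · Commenters: ${summary.totalCommenters}`,
      `New leads added: ${summary.newLeads}`,
      ...summary.perPost,
    ].join("\n"),
  });
}
```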
The blueprint creates a fresh `leads` table in your workspace database with the minimum columns this skill needs: `id` (uuid PK), `linkedin_profile_url`, `first_name`, `last_name`, `linkedin_source_post_url`, plus `created_at` / `updated_at`. The dedup behavior comes from a **partial unique index** on `linkedin_profile_url` where it is not null — that way later enrichment can add rows keyed by email or X username without colliding. Don't overspec the table; grow it as you bolt on enrichment and outbound steps.
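As a concrete sketch, both tables could be created like this. Column names come straight from the blueprint; the index name and node-postgres wiring are illustrative, and `gen_random_uuid()` assumes Postgres 13+ (or the pgcrypto extension).

```typescript
// Sketch: create the leads table and the scrape audit table with node-postgres.
import { Client } from "pg";

async function createTables(client: Client) {
  await client.query(`
    CREATE TABLE IF NOT EXISTS leads (
      id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
      linkedin_profile_url text,
      first_name text,
      last_name text,
      linkedin_source_post_url text,
      created_at timestamptz NOT NULL DEFAULT now(),
      updated_at timestamptz NOT NULL DEFAULT now()
    );

    -- Partial unique index: dedupes on profile URL without blocking rows
    -- keyed by email or X username after later enrichment.
    CREATE UNIQUE INDEX IF NOT EXISTS leads_linkedin_profile_url_key
      ON leads (linkedin_profile_url)
      WHERE linkedin_profile_url IS NOT NULL;

    CREATE TABLE IF NOT EXISTS linkedin_scraped_posts (
      post_url text PRIMARY KEY,
      post_urn text,
      keyword_used text,
      author_name text,
      post_snippet text,
      reactor_count integer,
      commenter_count integer,
      leads_added integer,
      scraped_at timestamptz NOT NULL DEFAULT now()
    );
  `);
}
```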
The CTA above seeds a builder session prefilled with this blueprint — providers, the `linkedin_scraped_posts` schema, the skill spec, and the personalization questions you need to answer (product, ICP, topics, hashtags, channel). The agent walks you through each one and generates the deployable agent + schedule.
Before turning on the daily schedule, ask the agent to scrape a single hand-picked post — pick one where you know the audience is your ICP. Inspect the rows it writes to `leads` and confirm the `linkedin_source_post_url` column is set. If profile URLs are missing, the response shape walker probably needs a new key — flag it and the agent will adjust.
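That walker is just a recursive scan of the API response for known profile-URL keys. A minimal sketch follows; the key names are assumptions, and the real skill's list should match agntdata's actual response shapes.

```typescript
// Minimal sketch of a response shape walker: recursively scan an API response
// for keys that look like profile URLs. Key names here are assumptions; add a
// new key to the set when a response shape changes and URLs stop coming through.
const PROFILE_URL_KEYS = new Set(["profile_url", "profileUrl", "navigationUrl"]);

function collectProfileUrls(node: unknown, found: Set<string> = new Set()): Set<string> {
  if (Array.isArray(node)) {
    for (const item of node) collectProfileUrls(item, found);
  } else if (node && typeof node === "object") {
    for (const [key, value] of Object.entries(node as Record<string, unknown>)) {
      if (
        PROFILE_URL_KEYS.has(key) &&
        typeof value === "string" &&
        value.includes("linkedin.com/in/")
      ) {
        found.add(value);
      } else {
        collectProfileUrls(value, found);
      }
    }
  }
  return found;
}
```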
Default cadence is once a day at 09:00 in your timezone. Cron expression `0 9 * * *`. Tail the Slack channel after the first run to confirm posts processed > 0 and new leads added > 0. If you want more volume, move to every 5 hours (`0 */5 * * *`) once you trust the term selection.
Watch the first 3–5 daily summaries. The fastest wins: (1) add hashtags your ICP actually uses (not the obvious ones), (2) prune topics that surface industry commentary instead of buyer pain, (3) widen the keyword set if reactor counts are tiny. The agent picks its own terms each run from the topic list — keep the list curated and it'll do the rest.
Endpoints used
The agntdata endpoints this blueprint depends on. All available with one API key.
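Chained together inside the skill, a run's calls might look like the sketch below. This is hypothetical: the base URL, auth header, and parameter names are assumptions, so check each endpoint card's docs for the real contract before using it.

```typescript
// Hypothetical sketch of one run's calls through agntdata. Base URL, auth
// header, and parameter names are assumptions; verify against the docs at
// https://agnt.mintlify.app/apis/overview.
const BASE = "https://api.agntdata.dev"; // assumed base URL
const headers = {
  "x-api-key": process.env.AGNTDATA_API_KEY ?? "", // assumed auth scheme
  "content-type": "application/json",
};

async function call(path: string, params: Record<string, string>) {
  const res = await fetch(`${BASE}${path}?${new URLSearchParams(params)}`, { headers });
  if (!res.ok) throw new Error(`${path} failed: ${res.status}`);
  return res.json();
}

async function runOnce(keyword: string) {
  // 1. Find candidate posts for this run's keyword.
  const posts = await call("/search-posts", { query: keyword });
  // 2. For each selected post, pull everyone who reacted.
  const reactions = await call("/get-post-reactions", { post_url: posts[0]?.url });
  // 3. Commenters go through the company or profile variant (see the cards below).
  return { posts, reactions };
}
```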
Search Posts
/search-posts
Keyword search for LinkedIn posts. Called once per chosen keyword each run; the agent diversifies terms across runs so the same posts do not keep surfacing.
View endpoint docs
Search Post by Hashtag
/search-posts-by-hashtag
Hashtag search for LinkedIn posts. Complements keyword search by catching community-tagged content (e.g. `#aiagents`, `#gtm`) that would not show up under literal phrases.
View endpoint docs
Get Post Reactions
/get-post-reactions
Lists everyone who reacted (liked, celebrated, etc.) to a chosen post. Called once per selected post inside the TypeScript skill.
View endpoint docs
Get Company Post Comments
/get-company-post-comments
Lists commenters on a company-page post. The skill dispatches to this endpoint when `is_company_post` is true.
View endpoint docs
Get Profile Post Comment
/get-profile-posts-comments
Lists commenters on a personal-profile post. The skill dispatches to this endpoint when the post is from an individual rather than a company.
View endpoint docs
Ship this blueprint today
One click spins up a builder session prefilled with this blueprint's spec. We'll ask you a handful of personalization questions, then generate the agent.
Related blueprints
Browse all →
Hiring for a role is the loudest buying signal LinkedIn gives away for free. This agent watches it daily — captures every company posting jobs in your buyer's role family, plus the hiring team behind each post — and writes them to a deduped pipeline of accounts + decision-makers ready for outreach.
Every day, find creators posting about your space on LinkedIn + X — filter for 1k+ followers and topic-relevance, enrich with verified emails, save to a deduped partnerships table. Pair with the creator outreach writer to actually pitch them.
Monitor 15+ subreddits twice a day for prospects describing the exact pain your product solves. AI-scored, deduplicated, and pushed to Slack — for around $4 a month.