enrich_reddit_thread
enrich_reddit_thread MCP tool: trigger Bright Data scrape of a Reddit thread. Charges 10 AI credits before triggering, idempotent on in-progress scrapes, async (1-3 min).
Updated 2026-04-26
enrich_reddit_thread triggers a Bright Data scrape of a Reddit thread to pull the post body, top comments, upvotes, and post date into the thread record. After the scrape lands, a follow-up get_reddit_thread (or list_reddit_threads) returns the full content so your agent can reason over it. Mentionable does not return a relevance score or suggested angle; the agent calling the MCP runs its own analysis.
When to use
Call it when list_reddit_threads returns a thread in NEW status and your agent decides it deserves a deeper read before engaging or dismissing. A typical loop: list threads with filters.status: ["NEW"], sort by score_desc, enrich the top N, poll get_reddit_thread until ENRICHED, then read content.postBody and content.topComments to draft a reply.
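The loop above can be sketched as follows. This is a minimal sketch, not the official client: `call_tool` is a hypothetical stand-in for however your MCP client invokes tools, and the `threads` array shape in the list response is an assumption.

```python
def enrich_top_threads(call_tool, project_id, top_n=3):
    """Enrich the top-N NEW threads by score. `call_tool(name, args)` is a
    hypothetical MCP client helper; the list response shape is assumed."""
    listing = call_tool("list_reddit_threads", {
        "projectId": project_id,
        "filters": {"status": ["NEW"]},
        "sort": "score_desc",
    })
    enriched = []
    for thread in listing["threads"][:top_n]:  # cap the loop (see Tips)
        resp = call_tool("enrich_reddit_thread", {
            "projectId": project_id,
            "redditPostId": thread["id"],
        })
        if resp["success"]:
            enriched.append(thread["id"])
    return enriched
```

After this, poll get_reddit_thread for each returned id until it reaches ENRICHED, then read content.postBody and content.topComments.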
Requires
`member` role minimum; `customer` role rejected.
Billing
Each call charges AI_CREDIT_COST_REDDIT_ENRICH (10 credits) before triggering the Bright Data scrape. If the workspace cannot afford it, the call returns success: false with credit details and no work is started. The charge is bypassed in development (NODE_ENV=development). Credits used by MCP calls aggregate with in-app credits in the same monthly counter; purchased credit add-ons (aiCreditsPurchased) count against the budget.
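Because every successful trigger costs a flat 10 credits, an agent can budget a batch up front from the `remaining` figure a previous response returned. A small sketch (the constant mirrors the documented AI_CREDIT_COST_REDDIT_ENRICH value):

```python
REDDIT_ENRICH_COST = 10  # documented value of AI_CREDIT_COST_REDDIT_ENRICH

def max_enrichments(remaining_credits, cost=REDDIT_ENRICH_COST):
    """How many enrichments the remaining monthly budget can cover."""
    return max(0, remaining_credits // cost)
```

With 1240 credits remaining this allows 124 enrichments; with 4 remaining it allows none, matching the insufficient-credits error below.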
Idempotency
If the thread is already in ENRICHING status, the call is a no-op: it returns success: true, alreadyInProgress: true and does not re-charge. This makes it safe to retry on transient errors. If the thread is ENRICHED or DELETED, the call will charge again and re-trigger — call get_reddit_thread first if you want to avoid that.
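One way to avoid the re-charge on ENRICHED or DELETED threads is to check the status first, as suggested above. A sketch, again using a hypothetical `call_tool` helper and an assumed `{"thread": {"status": ...}}` shape for the get_reddit_thread response:

```python
def enrich_if_needed(call_tool, project_id, post_id):
    """Trigger enrichment only when it would do new work, to avoid
    re-charging on ENRICHED/DELETED threads. Response shapes are assumed."""
    current = call_tool("get_reddit_thread", {
        "projectId": project_id,
        "redditPostId": post_id,
    })
    status = current["thread"]["status"]
    if status in ("ENRICHING", "ENRICHED", "DELETED"):
        # ENRICHING would be a free no-op anyway; the other two would re-charge.
        return {"success": True, "skipped": True, "status": status}
    return call_tool("enrich_reddit_thread", {
        "projectId": project_id,
        "redditPostId": post_id,
    })
```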
Async
The scrape is asynchronous and typically lands in 1 to 3 minutes. Poll get_reddit_thread with a 30-second interval until status: "ENRICHED" (or "DELETED" if the post is gone on Reddit).
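The polling step might look like the sketch below. `call_tool` and the get_reddit_thread response shape are assumptions; the interval and the terminal statuses come from this page.

```python
import time

def wait_for_enrichment(call_tool, project_id, post_id,
                        interval=30, timeout=600, sleep=time.sleep):
    """Poll get_reddit_thread every `interval` seconds until the scrape
    lands (ENRICHED) or the post is gone (DELETED). `sleep` is injectable
    so the loop is testable without real waits."""
    deadline = time.monotonic() + timeout
    while True:
        resp = call_tool("get_reddit_thread", {
            "projectId": project_id,
            "redditPostId": post_id,
        })
        status = resp["thread"]["status"]  # assumed response shape
        if status in ("ENRICHED", "DELETED"):
            return status
        if time.monotonic() + interval > deadline:
            raise TimeoutError(f"thread {post_id} still {status} after {timeout}s")
        sleep(interval)
```

A timeout well above the typical 1-3 minute window keeps slow scrapes from failing spuriously.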
Input
| Field | Type | Description |
|---|---|---|
| `projectId` | string (CUID) | Project the thread belongs to. |
| `redditPostId` | string (CUID) | Thread to enrich. |
Response
Success — scrape triggered
```json
{
  "success": true,
  "alreadyInProgress": false,
  "thread": {
    "id": "clxred001",
    "url": "https://reddit.com/r/SEO/comments/abc123",
    "status": "ENRICHING"
  },
  "credits": { "charged": 10, "remaining": 1240 },
  "message": "Enrichment started. Poll get_reddit_thread to watch for status: ENRICHED (typically 1-3 minutes)."
}
```
Success — already in progress (no charge)
```json
{
  "success": true,
  "alreadyInProgress": true,
  "thread": {
    "id": "clxred001",
    "url": "https://reddit.com/r/SEO/comments/abc123",
    "status": "ENRICHING"
  },
  "message": "Enrichment already in progress. Poll get_reddit_thread to watch for status: ENRICHED."
}
```
Error — insufficient credits
```json
{
  "success": false,
  "error": "Not enough AI credits (4 remaining, need 10). GROWTH plan includes 500 credits/month.",
  "credits": {
    "required": 10,
    "remaining": 4,
    "monthlyAllowance": 500,
    "tier": "GROWTH"
  }
}
```
Error — thread not found or cross-project
```json
{
  "success": false,
  "error": "Reddit thread not found or does not belong to this project"
}
```
Tips and patterns
- Always check `success` first, then `alreadyInProgress` to know whether you were charged.
- Cap your enrichment loop. Burning the whole monthly credit budget on Reddit threads when the user wanted competitor research is a real failure mode — set an upper bound per agent run.
- After enrichment, check `content.isDeletedOnReddit`. Dead leads should be marked `SKIPPED` via `bulk_update_reddit_thread_status`.
- In development, `credits` returns `{ devBypass: true }` so you know no real budget was touched.
- The credit check reads the same monthly counter as the in-app feature: an MCP-driven enrichment loop will consume from the dashboard budget visible at Settings → Billing.
Related tools
- `list_reddit_threads` — list candidates with `filters.status: ["NEW"]`.
- `get_reddit_thread` — poll the enrichment status.
- `bulk_update_reddit_thread_status` — mark the thread after the scrape (commented, skipped).