# SERP Collection Strategy

**Last Updated:** 2026-01-11

## Overview

SERP (Search Engine Results Page) data collection provides insights into the top-ranking competitors for target keywords. However, the SISTRIX API's `keyword.domain.seo` endpoint (queried with the `kw` parameter) is expensive at 100 credits per keyword.

## Cost Analysis

**SISTRIX API SERP Endpoint:**
- Endpoint: `keyword.domain.seo` with `kw` parameter
- Cost: 100 credits per keyword (fixed, regardless of limit)
- Returns: Top ranking domains for keyword (up to limit specified)

**Collection Scenarios:**

1. **All Keywords (700 keywords):**
   - Cost: 700 × 100 = 70,000 credits ❌ **Exceeds weekly limit**

2. **Primary Keywords Only (99 keywords):**
   - Cost: 99 × 100 = 9,900 credits ❌ **Exceeds weekly limit**

3. **Top 20 High-Value Keywords:**
   - Cost: 20 × 100 = 2,000 credits ✅ **Within budget, but expensive**

4. **Skip SERP Collection:**
   - Cost: 0 credits ✅ **Use GSC data instead**
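The scenario math above can be sketched as a quick pre-flight budget check. This is a minimal illustration, not part of any existing script; `WEEKLY_CREDIT_LIMIT` is a hypothetical placeholder, since the actual plan limit is not stated in this document (it falls somewhere between 2,000 and 9,900 credits given the scenarios above).

```python
# Pre-flight credit check for SERP collection scenarios.
COST_PER_KEYWORD = 100        # fixed SISTRIX cost per `kw` lookup
WEEKLY_CREDIT_LIMIT = 5_000   # hypothetical; substitute your plan's actual limit

def serp_cost(keyword_count: int) -> int:
    """Total credits for a SERP collection run over `keyword_count` keywords."""
    return keyword_count * COST_PER_KEYWORD

for label, n in [("all keywords", 700), ("primary only", 99), ("top 20", 20), ("skip", 0)]:
    cost = serp_cost(n)
    status = "within budget" if cost <= WEEKLY_CREDIT_LIMIT else "exceeds weekly limit"
    print(f"{label}: {n} keywords -> {cost} credits ({status})")
```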

## Recommended Strategy: Use GSC Data

**Decision:** Skip the expensive SISTRIX SERP collection and use Google Search Console (GSC) data instead.

**Rationale:**

1. **Cost Efficiency:** GSC data is free and already collected
2. **Real Performance Data:** GSC shows actual search performance (clicks, impressions, position)
3. **Comprehensive Coverage:** GSC data covers all keywords, not just top 20
4. **Regular Updates:** GSC data is collected regularly and stays current

**GSC Data Available:**

- Top queries per post (from `performance-gsc.json`)
- Average position per keyword
- Clicks and impressions
- CTR (Click-Through Rate)
- Search performance trends

**Use Cases:**

- **Competitive Analysis:** Use GSC position data to identify ranking opportunities
- **SERP Analysis:** Analyze top queries to understand search intent
- **Content Optimization:** Use position data to prioritize content improvements
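The content-optimization use case can be sketched as follows: filter GSC queries to those ranking just off page one, where improvements yield the biggest gains. The record shape shown in the docstring is an assumption; the actual schema of `performance-gsc.json` is not documented here, so adjust the field names accordingly.

```python
def ranking_opportunities(rows, min_pos=4.0, max_pos=20.0):
    """Return queries ranking just off the top positions, sorted by upside.

    Assumes each GSC row looks like (hypothetical shape):
        {"query": "...", "position": 8.3, "clicks": 12, "impressions": 900}
    """
    hits = [r for r in rows if min_pos <= r["position"] <= max_pos]
    # High impressions + weak position = largest optimization upside.
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)
```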

## Alternative: Selective SERP Collection

If SERP data is needed for specific high-value keywords, use the `collect-post-serp-data.php` script:

```bash
# Collect SERP for top 20 primary keywords
php v2/scripts/blog/collect-post-serp-data.php --limit=20

# Collect SERP for specific post
php v2/scripts/blog/collect-post-serp-data.php --post=slug --category=category
```

**When to Use:**

- Manual competitive analysis for specific keywords
- Content gap analysis for high-value keywords
- One-time research projects

**Cost:** 100 credits per keyword

## Implementation

**Current Status:** SERP collection is optional and not included in standard data collection workflow.

**Data Sources:**

1. **GSC Data** (Primary): `{post}/data/performance-gsc.json`
   - Top queries, positions, clicks, impressions
   - Collected regularly via `collect-post-performance-gsc.php`

2. **SISTRIX SERP Data** (Optional): `{post}/data/serp-results.json`
   - Top 10 ranking domains per keyword
   - Collected on-demand via `collect-post-serp-data.php`
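A consumer of these two sources needs to handle the optional SERP file gracefully, since it only exists for posts where on-demand collection has run. A minimal sketch, assuming the file layout above (the JSON contents themselves are unspecified here):

```python
import json
import os

def load_post_seo_data(post_dir: str) -> dict:
    """Load GSC data (expected present) and SERP data (optional) for a post."""
    with open(os.path.join(post_dir, "data", "performance-gsc.json")) as f:
        gsc = json.load(f)

    serp_path = os.path.join(post_dir, "data", "serp-results.json")
    serp = None
    if os.path.exists(serp_path):  # SERP collection is on-demand only
        with open(serp_path) as f:
            serp = json.load(f)

    return {"gsc": gsc, "serp": serp}
```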

## Documentation

SERP analysis can be included in SEO reports using GSC data:

- **Top Queries:** From GSC performance data
- **Average Position:** From GSC metrics
- **Competitive Analysis:** Based on GSC position data
- **Content Opportunities:** Identified from GSC search queries

## Future Considerations

If weekly credit limits increase or SERP endpoint costs decrease, consider:

1. Collecting SERP data for top 50 primary keywords
2. Monthly SERP collection for trending keywords
3. Automated SERP monitoring for high-value keywords

For now, GSC data provides sufficient insights for SEO analysis and content optimization.
