# SISTRIX API Credit Optimization Analysis

**Last Updated:** 2026-01-11

## Current Credit Status

- **Weekly Limit:** 10,000 credits (resets Monday)
- **Current Usage:** 1,999 credits
- **Remaining:** ~8,001 credits
- **Budget for Today:** Up to 5,000 more credits (total ~7K today, leaving ~3K for rest of week)

## Endpoint Cost Analysis

### Keyword Metrics (`keyword.seo.metrics`)

**Individual Calls:**
- Cost: 5 credits per keyword
- Example: 10 keywords = 50 credits, 10 API calls, ~10 seconds

**Batch Mode (Array of Keywords):**
- Cost: 5 credits per keyword (same as individual)
- Example: 10 keywords = 50 credits, 1 API call, ~1 second
- **Efficiency:** 90% reduction in API calls, 90% reduction in time
- **Credit Savings:** 0 credits (same cost)
- **Time Savings:** Significant (1 call vs 10 calls)

**Recommendation:** ✅ Always use batch mode for keyword metrics collection
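The batching arithmetic above can be sketched as a small helper. This is a minimal sketch of the cost/call math only; `batches` and `estimate_cost` are hypothetical names, and the actual SISTRIX API request is omitted:

```python
from typing import Iterator

CREDITS_PER_KEYWORD = 5  # keyword.seo.metrics: 5 credits per keyword
BATCH_SIZE = 10          # keywords sent per batched API call

def batches(keywords: list[str], size: int = BATCH_SIZE) -> Iterator[list[str]]:
    """Yield keyword batches so each API call carries up to `size` keywords."""
    for i in range(0, len(keywords), size):
        yield keywords[i:i + size]

def estimate_cost(keywords: list[str]) -> tuple[int, int]:
    """Return (api_calls, credits) for a batched metrics collection."""
    calls = -(-len(keywords) // BATCH_SIZE)  # ceiling division
    return calls, len(keywords) * CREDITS_PER_KEYWORD
```

For the 238 missing-post keywords this yields 24 calls at 1,190 credits, matching the budget figures below: batching changes the call count, not the credit cost.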

### SERP Data (`keyword.domain.seo` with `kw` parameter)

**Cost:** 100 credits per call (fixed, regardless of limit parameter)

**Collection Scenarios:**

1. **All Keywords (700 keywords):**
   - Cost: 700 × 100 = 70,000 credits ❌ **Exceeds weekly limit**

2. **Primary Keywords Only (99 keywords):**
   - Cost: 99 × 100 = 9,900 credits ❌ **Exceeds remaining budget (~8,001 credits)**

3. **Top 20 High-Value Keywords:**
   - Cost: 20 × 100 = 2,000 credits ✅ **Within budget**

4. **Skip SERP Collection:**
   - Cost: 0 credits ✅ **Use GSC data instead**

**Recommendation:** 
- Option A: Collect SERP for top 20 primary keywords only (2,000 credits)
- Option B: Skip SERP collection, use GSC data for search performance analysis
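The scenario costs above follow from the flat 100-credit price per SERP call. A minimal sketch of that arithmetic (`serp_budget` is a hypothetical helper, not a SISTRIX client function):

```python
SERP_CALL_COST = 100  # keyword.domain.seo with kw parameter: flat cost per call

def serp_budget(keyword_counts: dict[str, int]) -> dict[str, int]:
    """Credits needed per scenario, at one SERP call per keyword."""
    return {name: n * SERP_CALL_COST for name, n in keyword_counts.items()}
```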

### Domain-Level Endpoints (One-Time Collection)

**Total Estimated Cost: ~252 credits**

1. **Domain Opportunities** (`domain.opportunities`)
   - Cost: 1 credit per opportunity returned
   - Limit: 100 opportunities = ~100 credits
   - Use: Identify keyword opportunities for all posts

2. **Domain Competitors** (`domain.competitors.seo`)
   - Cost: 1 credit per competitor returned
   - Limit: 50 competitors = ~50 credits
   - Use: Competitive analysis for all posts

3. **Ranking Distribution** (`domain.ranking.distribution`)
   - Cost: 1 credit
   - Use: Overall ranking performance insights

4. **Traffic Estimation** (`domain.traffic.estimation`)
   - Cost: 1 credit
   - Use: Domain traffic estimates

5. **Domain Keywords** (`keyword.domain.seo` with `domain` parameter)
   - Cost: 1 credit per keyword returned
   - Limit: 100 keywords = ~100 credits
   - Use: Top keywords domain ranks for

**Recommendation:** ✅ Collect once, reuse for all posts

## Credit Budget Breakdown

### Remaining Credits: ~8,001

### Planned Usage:

1. **Keyword Collection for 34 Missing Posts**
   - Posts: 34
   - Keywords per post: ~7
   - Total keywords: ~238
   - **Batch mode:** 24 batches (238 keywords × 5 credits) = ~1,190 credits
   - **Individual mode:** 238 × 5 = 1,190 credits (same cost, but slower)

2. **Domain-Level Data (One-Time)**
   - Opportunities: ~100 credits
   - Competitors: ~50 credits
   - Ranking distribution: 1 credit
   - Traffic estimation: 1 credit
   - Domain keywords: ~100 credits
   - **Total: ~252 credits**

3. **SERP Data (Optional)**
   - Option A: Top 20 primary keywords = 2,000 credits
   - Option B: Skip = 0 credits
   - **Recommendation:** Option B (use GSC data instead)

4. **Buffer for Errors/Retries**
   - Estimated: ~500 credits

### Total Planned Usage:

**With SERP Collection (Option A):**
- Keywords: 1,190 credits
- Domain-level: 252 credits
- SERP: 2,000 credits
- Buffer: 500 credits
- **Total: ~3,942 credits**
- **Remaining: ~4,059 credits**

**Without SERP Collection (Option B - Recommended):**
- Keywords: 1,190 credits
- Domain-level: 252 credits
- SERP: 0 credits (use GSC)
- Buffer: 500 credits
- **Total: ~1,942 credits**
- **Remaining: ~6,059 credits**
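The two option totals can be checked with a small calculation. A sketch under the figures above (`plan` is an illustrative name; only the SERP cost varies between options):

```python
REMAINING_CREDITS = 8001  # ~8,001 credits left this week

def plan(serp_credits: int) -> tuple[int, int]:
    """Return (total planned spend, credits remaining) for a SERP option cost."""
    keywords, domain_level, buffer = 1190, 252, 500
    total = keywords + domain_level + serp_credits + buffer
    return total, REMAINING_CREDITS - total
```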

## Optimization Strategies

### 1. Batch Processing

**Implementation:**
- Use `keyword.seo.metrics` with array of keywords (batch mode)
- Process keywords in batches of 10
- Reduces API calls by 90%
- No credit savings, but significant time savings

**Impact:**
- 238 keywords = 24 API calls (vs 238 individual calls)
- Time: ~24 seconds (vs ~4 minutes), at ~1 second per API call

### 2. Domain-Level Data Reuse

**Implementation:**
- Collect domain-level data once
- Store in shared location: `docs/content/blog/domain-level-data/sistrix-domain-data.json`
- Reference in all post documentation

**Impact:**
- Saves ~2,475 credits (if collected per-post: 99 posts × 25 credits)
- One-time cost: ~252 credits
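The collect-once-reuse-everywhere pattern can be sketched as a simple JSON cache. This is a minimal illustration, not the project's actual script; `load_or_collect` and the `collect` callable are hypothetical:

```python
import json
from pathlib import Path

def load_or_collect(path: Path, collect):
    """Return cached domain-level data; call `collect` and cache only on first run."""
    if path.exists():
        return json.loads(path.read_text())
    data = collect()  # hypothetical callable that hits the SISTRIX API
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(data, indent=2))
    return data
```

Pointing `path` at the shared location above means the ~252-credit collection runs once, and every subsequent post build reads the file for free.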

### 3. SERP Data Strategy

**Option A: Collect Top 20 Keywords**
- Cost: 2,000 credits
- Benefit: Detailed SERP analysis for high-value keywords
- Use case: Manual review and competitive analysis

**Option B: Use GSC Data (Recommended)**
- Cost: 0 credits (already collected)
- Benefit: Real search performance data
- Use case: Search performance analysis

**Recommendation:** Option B - Use GSC data for search performance, skip expensive SERP collection

### 4. Credit Tracking

**Current System:**
- Tracks daily usage
- Needs update for weekly limits

**Update Required:**
- Change `daily_credit_limit` to `weekly_credit_limit`
- Update credit log structure to track weekly usage
- Add weekly reset logic
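The weekly reset logic can be sketched as follows, assuming the log stores (date, credits) entries; function names are illustrative:

```python
from datetime import date, timedelta

def week_start(day: date) -> date:
    """Monday of the week containing `day` (credits reset on Monday)."""
    return day - timedelta(days=day.weekday())

def usage_this_week(log: list[tuple[date, int]], today: date) -> int:
    """Sum credits spent since the most recent Monday reset."""
    start = week_start(today)
    return sum(credits for day, credits in log if day >= start)
```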

## Collection Priority

### High Priority (Complete Today)

1. ✅ Keyword collection for 34 missing posts (1,190 credits)
2. ✅ Domain-level data collection (252 credits)
3. ✅ Update credit tracking for weekly limits

### Medium Priority (If Credits Available)

1. ⚠️ SERP collection for top 20 keywords (2,000 credits) - **Optional**
2. ⚠️ Additional keyword metrics (if needed)

### Low Priority (Future)

1. SERP collection for all keywords (too expensive)
2. Additional domain-level data (if needed)

## Risk Assessment

### Risk 1: SERP Collection Exceeds Budget

**Probability:** High (if collecting for all keywords)
**Impact:** High (exceeds weekly limit)
**Mitigation:** Collect only for top 20 keywords or skip entirely

### Risk 2: Batch Processing Fails

**Probability:** Low (tested and working)
**Impact:** Medium (slower collection, same credits)
**Mitigation:** Fallback to individual calls if batch fails
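The fallback mitigation can be sketched as below; `batch_call` and `single_call` are placeholder callables standing in for the real client functions, which the source does not name:

```python
def fetch_metrics(keywords: list[str], batch_call, single_call) -> list:
    """Try one batched call; on failure, fall back to per-keyword calls.

    Credit cost is identical either way (5 credits per keyword); only
    call count and elapsed time differ.
    """
    try:
        return batch_call(keywords)
    except Exception:
        return [single_call(kw) for kw in keywords]
```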

### Risk 3: Domain-Level Calls Cost More Than Estimated

**Probability:** Medium (depends on data returned)
**Impact:** Low (can adjust limits)
**Mitigation:** Monitor credit usage, adjust limits if needed

## Recommendations

1. ✅ **Implement batch processing** for keyword metrics (saves time)
2. ✅ **Collect domain-level data once** (saves credits)
3. ⚠️ **Skip SERP collection** (too expensive, use GSC data instead)
4. ✅ **Update credit tracking** for weekly limits
5. ✅ **Monitor credit usage** throughout collection

## Next Steps

1. Update config for weekly credit limits
2. Implement batch processing in collection scripts
3. Create domain-level data collection script
4. Collect missing keyword data using batch mode
5. Collect domain-level data once
6. Update documentation generation to use domain-level data
7. Document SERP collection strategy (skip, use GSC)
