
Case Study: How an SEO Software Company Achieved 99.2% Success Rate Tracking 5M Keywords Daily

Mike Chen
Founder @ ProxyLabs
January 28, 2026
7 min read


Company: RankRadar (name changed for privacy)
Industry: SEO software / SERP tracking
Challenge: Tracking 2M keywords daily with 60% Google blocks and inaccurate data
Result: 99.2% success rate, scaled to 5M keywords, customer churn dropped 75%


The Problem

RankRadar provides SERP tracking and rank monitoring for SEO agencies and enterprises. Their platform tracks keyword positions across Google, Bing, and local search results for thousands of clients.

When they approached us, their infrastructure was crumbling:

  • 2M keywords tracked daily across global and local search results
  • 60% request failure rate due to Google blocking
  • Unreliable rank data causing client disputes and churn
  • 3 different proxy providers costing $8K/month total
  • Customer churn rate of 12% per month due to data accuracy issues
  • NPS score of 23; clients were frustrated with inconsistent rankings

Their CTO explained: "Our clients pay for accurate rank tracking, but we're delivering garbage data. Google blocks us constantly, and we're burning through proxies faster than we can buy them."

The Root Cause Analysis

We conducted a 72-hour audit of their scraping pipeline. Here's what we uncovered:

Problem 1: Insufficient Geographic Coverage

They were using generic proxy pools without proper geographic targeting:

Test: 500 local search queries for "plumber New York"
Result:
- 280 returned results for wrong locations (56%)
- 120 blocked due to IP-location mismatch (24%)
- 100 successful with accurate local results (20%)

Google's local SERP algorithm heavily weights user location. Using proxies from random countries returned irrelevant results.
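Accurate local results depend on Google's `uule` URL parameter, which encodes a canonical location name. The format below is the community-documented, reverse-engineered encoding (not an official Google API); canonical names such as "New York,New York,United States" come from Google's published geotargets list:

```python
# Alphabet used to encode the canonical name's byte length (reverse-engineered)
UULE_KEY = (
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "abcdefghijklmnopqrstuvwxyz"
    "0123456789-_"
)

def encode_uule(canonical_name: str) -> str:
    """Build a `uule` value from a canonical location name,
    e.g. "New York,New York,United States"."""
    # The character after the fixed prefix encodes the name's length in bytes
    key = UULE_KEY[len(canonical_name.encode("utf-8"))]
    return "w+CAIQICI" + key + canonical_name
```

The resulting value still needs URL-encoding when placed in the query string, and names longer than 64 bytes need the protobuf-based variant of the parameter.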

Problem 2: Aggressive Scraping Patterns

Their scraper treated all search engines equally:

  • 100 concurrent requests per domain
  • No geographic delays or session management
  • Identical user agents across all requests
  • No handling for Google's anti-bot measures

This triggered Google's detection systems immediately, especially for local searches requiring residential IP patterns.
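A gentler request pattern fixes all three signals at once: bounded per-domain concurrency, jittered delays instead of fixed intervals, and a rotating user-agent pool. A minimal sketch (the UA strings and delay bounds are illustrative, not prescriptive):

```python
import asyncio
import random

USER_AGENTS = [
    # Illustrative pool; a real deployment rotates current browser versions
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/120.0 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 "
    "(KHTML, like Gecko) Version/17.0 Safari/605.1.15",
]

class PacedFetcher:
    """Limit per-domain concurrency and add human-like jitter between requests."""

    def __init__(self, max_concurrent=5, min_delay=2.0, max_delay=6.0):
        self.semaphore = asyncio.Semaphore(max_concurrent)
        self.min_delay = min_delay
        self.max_delay = max_delay

    async def fetch(self, session, url):
        async with self.semaphore:
            # Randomized delay breaks the fixed-interval scraper signature
            await asyncio.sleep(random.uniform(self.min_delay, self.max_delay))
            headers = {"User-Agent": random.choice(USER_AGENTS)}
            return await session.get(url, headers=headers)
```

Dropping from 100 concurrent requests to a small, jittered budget per domain costs some raw throughput but is what makes sustained high success rates possible.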

Problem 3: Provider Fragmentation

Managing 3 proxy providers created complexity:

  1. Different APIs and authentication
  2. Inconsistent IP quality across providers
  3. No unified session health tracking
  4. Higher total costs due to minimum commitments

When one provider's IPs got burned, they couldn't seamlessly rotate to better ones.

The Solution

We migrated them to ProxyLabs private pools over 4 weeks. Here's the transformation:

Phase 1: Geographic Targeting Implementation

Switched to private residential pools with city-level targeting:

| Metric | Generic Pools (Before) | Geographic Pools (After) |
|--------|------------------------|--------------------------|
| Local result accuracy | 20% | 96% |
| Location mismatch rate | 56% | under 2% |
| Google block rate | 60% | 8% |

Code example for geographic SERP scraping:

from urllib.parse import quote_plus

class GeoSerpScraper:
    def __init__(self, proxy_manager):
        self.proxy_manager = proxy_manager
        self.location_db = LocationDatabase()

    async def search_with_location(self, keyword, city, state):
        # Get a residential proxy exit in the target city
        proxy = self.proxy_manager.get_city_proxy(city, state)

        # Browser-like headers; the proxy itself supplies the residential IP,
        # so no spoofed forwarding headers are needed (they only flag the request)
        headers = {
            'User-Agent': self.generate_realistic_ua(),
            'Accept-Language': 'en-US,en;q=0.9',
        }

        # Google search URL with URL-encoded query and uule location parameter
        search_url = (
            "https://www.google.com/search"
            f"?q={quote_plus(keyword)}&uule={self.encode_location(city, state)}"
        )

        response = await self.make_request(search_url, headers, proxy)

        # Parse local SERP results
        return self.parse_serp(response, keyword)

Phase 2: Smart Rotation with Session Health

Implemented session-based rotation with health tracking:

class SessionHealthTracker:
    def __init__(self):
        self.sessions = {}  # session_id -> Session (location, health_score, counters)
        self.ip_health = {}  # ip -> rolling success rate
        
    async def get_healthy_session(self, domain, location):
        # Find session with high health score for this location
        candidates = [
            s for s in self.sessions.values() 
            if s.location == location and s.health_score > 0.8
        ]
        
        if candidates:
            return max(candidates, key=lambda s: s.health_score)
        
        # Create new session with fresh residential IP
        return self.create_new_session(domain, location)
    
    def update_health(self, session_id, success, block_type=None):
        session = self.sessions[session_id]
        
        if success:
            session.health_score = min(1.0, session.health_score + 0.1)
            session.success_count += 1
        else:
            session.health_score *= 0.7  # Decay on failure
            if block_type == 'captcha':
                session.health_score *= 0.5  # Major penalty

Key improvements:

  • Residential IP priority: Only residential IPs for Google searches
  • Location-sticky sessions: Same IP/location combo for 15-20 minutes
  • Health-based rotation: Proactively rotate before IPs get flagged
  • Batch processing: Process keywords in location-based batches
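The 15-20 minute location-sticky window from the list above reduces to a simple expiry check on each session object (how this plugs into `SessionHealthTracker` is assumed here):

```python
import random
import time

class StickySession:
    """Pin an IP/location pair for a bounded window, then force rotation."""

    def __init__(self, ip, location, min_ttl=15 * 60, max_ttl=20 * 60):
        self.ip = ip
        self.location = location
        self.created_at = time.monotonic()
        # Randomize the TTL within 15-20 min so sessions don't all expire at once
        self.ttl = random.uniform(min_ttl, max_ttl)
        self.health_score = 1.0

    def expired(self) -> bool:
        return time.monotonic() - self.created_at >= self.ttl

    def usable(self) -> bool:
        # Rotate when the sticky window lapses OR health degrades, whichever first
        return not self.expired() and self.health_score > 0.8
```

Randomizing each session's TTL is a deliberate choice: identical lifetimes would produce synchronized rotation spikes, which are themselves a detectable pattern.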

Phase 3: Batch Processing for Scale

Optimized for millions of keywords with intelligent queuing:

async def process_keyword_batch(self, keywords, locations):
    # Group keywords by location for efficiency
    location_batches = self.group_by_location(keywords, locations)
    
    results = []
    for location, batch in location_batches.items():
        # Get healthy session for this location
        session = await self.session_tracker.get_healthy_session('google.com', location)
        
        # Process batch with controlled concurrency
        batch_results = await self.process_batch_concurrent(batch, session, max_concurrent=5)
        
        results.extend(batch_results)
        
        # Update session health based on batch performance
        success_rate = len([r for r in batch_results if r.success]) / len(batch_results)
        self.session_tracker.update_batch_health(session.id, success_rate)
    
    return results

This approach scaled them from 2M to 5M keywords while maintaining high accuracy.

The Results

After migration and optimization:

Performance Metrics

| Metric | Before | After | Change |
|--------|--------|-------|--------|
| Daily keywords tracked | 2M | 5M | +150% |
| Success rate | 40% | 99.2% | +148% |
| Local result accuracy | 20% | 96% | +380% |
| Google block rate | 60% | 0.8% | -99% |
| Avg response time | 8.5s | 2.1s | -75% |

Cost Breakdown

| Cost Category | Before | After | Savings |
|---------------|--------|-------|---------|
| Proxy costs | $8,000/mo | $4,500/mo | $3,500 |
| Multiple provider management | $1,200/mo | $0 | $1,200 |
| Total | $9,200/mo | $4,500/mo | $4,700 (51%) |

Business Impact

  • Customer churn: Dropped from 12% to 3% per month
  • NPS score: Improved from 23 to 67
  • New client acquisition: 40% increase due to reliable local tracking
  • Data accuracy: 96% local SERP accuracy vs 20% before

Key Takeaways

1. Geographic Targeting Is Essential for Local SERP

Google's algorithm personalizes results based on user location. Generic proxies from random countries return irrelevant data. City-level residential targeting is crucial for accurate local rank tracking.

2. Residential IPs Evade Detection Better

Google's anti-bot systems are tuned to detect datacenter IPs. Residential IPs from real ISPs have much higher success rates and lower block rates, especially for local searches.

3. Session Health Tracking Prevents Blocks

Monitoring IP and session performance allows proactive rotation before blocks occur. This maintains high success rates even at scale.

4. Consolidation Reduces Complexity

Managing multiple proxy providers adds overhead. A single high-quality provider with comprehensive features simplifies infrastructure and reduces costs.

Technical Architecture (Final State)

┌─────────────────────────────────────────────────────────┐
│                 Keyword Queue                            │
│           (Redis, 5M keywords/day)                       │
└─────────────────────┬───────────────────────────────────┘
                      │
                      ▼
┌─────────────────────────────────────────────────────────┐
│              Location Batcher                           │
│  - Group by city/state                                  │
│  - Batch processing optimization                        │
└─────────────────────┬───────────────────────────────────┘
                      │
                      ▼
┌─────────────────────────────────────────────────────────┐
│              Session Health Tracker                     │
│  - Residential IP pools by location                     │
│  - Health scoring and rotation                          │
│  - 15-20 min sticky sessions                            │
└─────────────────────┬───────────────────────────────────┘
                      │
                      ▼
┌─────────────────────────────────────────────────────────┐
│               ProxyLabs Private Pools                   │
│  - 12M residential IPs                                  │
│  - City-level geographic targeting                      │
│  - Real-time health monitoring                          │
└─────────────────────┬───────────────────────────────────┘
                      │
                      ▼
┌─────────────────────────────────────────────────────────┐
│               Google/Bing SERP                          │
│  - Global and local search results                      │
│  - Accurate rank positions                              │
└─────────────────────────────────────────────────────────┘

Conclusion

RankRadar's success wasn't about finding more proxies. It was about using the right proxies intelligently:

  1. Geographic precision for local SERP accuracy
  2. Residential quality to bypass Google's detection
  3. Smart session management for sustained success rates
  4. Consolidation for operational simplicity

The result: Reliable rank tracking at scale, happy customers, and 51% cost reduction.


Ready to achieve similar results? RankRadar started with a targeted location trial. Start your trial at proxylabs.net/dashboard.
