
Proxy Authentication: IP Whitelist vs User/Pass vs API

James Liu
Lead Engineer @ ProxyLabs
March 15, 2026
7 min read

Proxy authentication determines how the proxy server verifies that you're authorized to route traffic through it. The three common methods — IP whitelisting, username/password, and API tokens — each have real trade-offs in security, flexibility, and ease of integration. Picking the wrong one for your setup can mean either security holes or constant authentication failures.

The Three Methods Compared

Factor | IP Whitelist | Username/Password | API Token
------ | ------------ | ----------------- | ---------
Setup complexity | Low | Low | Medium
Works from dynamic IPs | No | Yes | Yes
Works from cloud/serverless | Only if IP is static | Yes | Yes
Credential rotation | Change IP list | Change password | Rotate token
Per-request customization | None | Username modifiers | Query params
Protocol support | HTTP/HTTPS/SOCKS5 | HTTP/HTTPS/SOCKS5 | HTTP only (REST)
Shared team access | Add each member's IP | Share credentials | Per-user tokens
Security if credential leaks | N/A (IP-bound) | Full access until changed | Revoke single token

IP Whitelisting

IP whitelisting authorizes traffic based on the source IP of the connection to the proxy server. You register your server's IP address in the proxy dashboard, and any request from that IP goes through without additional credentials.

How It Works

Your Server (IP: 203.0.113.50)
    │
    ├── Connects to proxy gateway
    │   Gateway checks: Is 203.0.113.50 whitelisted? ✓
    │
    └── Request routed through residential IP
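
Conceptually, the gateway-side check in the diagram is just a lookup against the set of registered IPs. A simplified sketch of the idea (illustrative only, not ProxyLabs's actual implementation):

```python
# Simplified model of a gateway whitelist check (illustrative only).
WHITELIST = {"203.0.113.50", "198.51.100.7"}

def is_authorized(source_ip: str) -> bool:
    # Allow the connection only if the source IP was pre-registered.
    return source_ip in WHITELIST

print(is_authorized("203.0.113.50"))  # True
print(is_authorized("192.0.2.99"))    # False
```

Because the check happens at connection time, no credentials ever appear in your requests.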

Setup

# No auth needed in the request — your IP is pre-authorized
curl -x http://gate.proxylabs.app:8080 https://httpbin.org/ip

# Python — no username/password required
import requests

proxy = {
    'http': 'http://gate.proxylabs.app:8080',
    'https': 'http://gate.proxylabs.app:8080',
}

r = requests.get('https://httpbin.org/ip', proxies=proxy)
print(r.json())

When IP Whitelisting Works Well

  • Dedicated servers: Your scraping server has a static IP that never changes. Whitelist it once and forget about it.
  • Simple scripts: No need to handle credentials in code — reduces the chance of accidentally committing passwords to git.
  • High-throughput pipelines: No auth header overhead (though this is microseconds — not a real performance factor).

When IP Whitelisting Fails

  • Home connections: Most ISPs assign dynamic IPs. Your whitelist breaks when the IP changes (could be daily or weekly).
  • Cloud functions: AWS Lambda, Google Cloud Functions, and Vercel Edge Functions don't have predictable IPs. Whitelisting entire CIDR ranges is insecure.
  • Laptop development: Your IP changes every time you switch networks (office, home, coffee shop).
  • Team environments: Every developer needs their IP whitelisted. Someone's IP changes, scraper breaks at 2 AM.

The Dynamic IP Problem

If your IP is semi-dynamic (changes infrequently), you can auto-update the whitelist:

import requests

def update_whitelist(api_key, new_ip):
    """Example of auto-updating whitelist via provider API."""
    response = requests.post(
        'https://api.proxylabs.app/v1/whitelist',
        headers={'Authorization': f'Bearer {api_key}'},
        json={'ip': new_ip}
    )
    return response.status_code == 200

# Get current public IP and update whitelist
my_ip = requests.get('https://httpbin.org/ip').json()['origin']
update_whitelist('your-api-key', my_ip)

But this is a fragile workaround. If your use case involves dynamic source IPs, username/password auth is the better choice.

Username/Password Authentication

The most common method for residential proxies. You include credentials in each proxy request, either in the URL or via the Proxy-Authorization header.

How It Works

Your Server (any IP)
    │
    ├── Connects to proxy gateway
    │   Sends: Proxy-Authorization: Basic base64(username:password)
    │   Gateway validates credentials ✓
    │
    └── Request routed through residential IP
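
The Proxy-Authorization value shown in the diagram is standard HTTP Basic auth: the username and password joined by a colon, then Base64-encoded. A quick sketch of how a client builds it:

```python
import base64

def basic_proxy_auth(username: str, password: str) -> str:
    # Basic auth scheme: "Basic " + base64("username:password")
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

print(basic_proxy_auth("your-username", "your-password"))
```

Most HTTP libraries build this header for you when credentials are embedded in the proxy URL, so you rarely need to construct it by hand.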

Basic Usage

# cURL
curl -x http://your-username:[email protected]:8080 https://httpbin.org/ip

# With country targeting via username modifier
curl -x http://your-username-country-US:[email protected]:8080 https://httpbin.org/ip

# Python
import requests

# Rotating proxy
proxy = {
    'http': 'http://your-username:[email protected]:8080',
    'https': 'http://your-username:[email protected]:8080',
}

# Geo-targeted + sticky session
proxy_sticky_gb = {
    'http': 'http://your-username-country-GB-session-sess01:[email protected]:8080',
    'https': 'http://your-username-country-GB-session-sess01:[email protected]:8080',
}

The Killer Feature: Username Modifiers

Username/password auth enables per-request customization through username modifiers. This is why most scraping setups prefer it — you can control geo-targeting, session persistence, and rotation behavior on every request without changing your proxy configuration.

Modifier | Format | Example
-------- | ------ | -------
Country | -country-XX | your-username-country-US
City | -city-CityName | your-username-city-London
Session (sticky IP) | -session-ID | your-username-session-abc123
Combined | Chain modifiers | your-username-country-DE-city-Berlin-session-xyz

import uuid

# Different IP per request — default behavior
proxy_rotating = 'http://your-username:[email protected]:8080'

# Same IP for 30 minutes
session_id = uuid.uuid4().hex[:12]
proxy_sticky = f'http://your-username-session-{session_id}:[email protected]:8080'

# US IP from Chicago
proxy_geo = 'http://your-username-country-US-city-Chicago:[email protected]:8080'
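
Chaining modifiers by hand makes it easy to typo a separator. A small helper (hypothetical, assuming the modifier order shown in the table above) keeps usernames consistent:

```python
def proxy_username(base, country=None, city=None, session=None):
    # Chain modifiers onto the base username: country, then city, then session.
    # Modifier order is an assumption based on the examples in the table above.
    parts = [base]
    if country:
        parts += ["country", country]
    if city:
        parts += ["city", city]
    if session:
        parts += ["session", session]
    return "-".join(parts)

print(proxy_username("your-username", country="DE", city="Berlin", session="xyz"))
# your-username-country-DE-city-Berlin-session-xyz
```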

For a detailed comparison of when to use rotating vs sticky sessions, see rotating vs sticky proxies.

Security Considerations

Environment variables: Never hardcode credentials. Use environment variables or a secrets manager.

import os

PROXY_USER = os.environ['PROXY_USER']
PROXY_PASS = os.environ['PROXY_PASS']

proxy = {
    'http': f'http://{PROXY_USER}:{PROXY_PASS}@gate.proxylabs.app:8080',
    'https': f'http://{PROXY_USER}:{PROXY_PASS}@gate.proxylabs.app:8080',
}

Credential exposure in logs: Many HTTP libraries log the full URL, including credentials. Either sanitize your logs or, for plain-HTTP targets, send the Proxy-Authorization header yourself:

import requests
import base64

proxy = {'http': 'http://gate.proxylabs.app:8080'}
auth = base64.b64encode(b'your-username:your-password').decode()

session = requests.Session()
session.proxies = proxy
session.headers['Proxy-Authorization'] = f'Basic {auth}'

# Logs won't show credentials in the proxy URL.
# Caveat: this works for http:// targets only. For https:// targets,
# requests tunnels via CONNECT and only sends proxy credentials taken
# from the proxy URL itself; a session-level header would be forwarded
# to the target server instead of the proxy.
r = session.get('http://httpbin.org/ip')

Browser-Based Auth

Selenium and Puppeteer handle proxy auth differently than HTTP libraries; see the tool-specific guides for setup details.

API Token Authentication

Some proxy providers expose their proxy pool through a REST API instead of (or in addition to) a gateway server. You send an HTTP request to the API with your token, and it returns either a proxy list or the proxied response directly.

How It Works

Your Server
    │
    ├── GET https://api.provider.com/v1/proxy?url=target.com
    │   Header: Authorization: Bearer <api-token>
    │
    └── API returns proxied response (or proxy endpoint)

Typical Usage

import requests

API_TOKEN = 'your-api-token'

# Some providers proxy the request for you
response = requests.get(
    'https://api.provider.com/v1/scrape',
    headers={'Authorization': f'Bearer {API_TOKEN}'},
    params={
        'url': 'https://example.com/product/123',
        'country': 'US',
        'render_js': 'false',
    }
)

print(response.json())

When API Tokens Make Sense

  • Serverless environments: Lambda/Cloud Functions where you can't maintain persistent proxy connections.
  • Team management: Issue per-developer tokens, revoke individually without affecting others.
  • Usage tracking: API calls are inherently trackable — easy to see who's using what.

Downsides

  • No protocol flexibility: API tokens only work over HTTP. You can't use them with SOCKS5, browser automation tools, or non-HTTP protocols.
  • Higher latency: Extra hop through the API layer adds 50-200ms per request compared to direct gateway connections.
  • Vendor lock-in: Your code calls a provider-specific API. Switching providers means rewriting integration code. With gateway proxies (username/password auth), switching is just changing a URL.
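
The lock-in point is worth engineering around regardless of method: keep the gateway endpoint and credentials in configuration rather than code, so switching providers is an env-var change. A minimal sketch (env-var names are hypothetical):

```python
import os

# Hypothetical env-var names; gateway URL and credentials are config, not code.
PROXY_HOST = os.environ.get("PROXY_HOST", "gate.proxylabs.app:8080")
PROXY_USER = os.environ.get("PROXY_USER", "your-username")
PROXY_PASS = os.environ.get("PROXY_PASS", "your-password")

def build_proxies():
    # Standard proxy-dict shape accepted by requests and most HTTP clients.
    url = f"http://{PROXY_USER}:{PROXY_PASS}@{PROXY_HOST}"
    return {"http": url, "https": url}

print(build_proxies()["http"])
```

With this shape, moving to a different gateway provider means updating three environment variables, not touching the scraping code.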

Which Should You Use?

If you scrape from a fixed server: Start with IP whitelisting for simplicity. Add username/password auth if you need geo-targeting or session control.

If you scrape from variable IPs (cloud, home, team): Username/password auth. It works everywhere and gives you per-request control through username modifiers.

If you use serverless functions: Username/password auth works in most cases. API tokens if your provider offers them and you want per-function usage tracking.

If you manage a team: API tokens for individual accountability. Username/password with shared credentials if you trust the team and want simplicity.
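
The recommendations above can be condensed into a rough decision helper (a sketch of this article's guidance, not an official API):

```python
def recommend_auth(static_ip=False, serverless=False,
                   team=False, needs_tracking=False):
    # Rough encoding of the recommendations above.
    if static_ip and not (team or needs_tracking):
        return "ip-whitelist"
    if (serverless or team) and needs_tracking:
        return "api-token"
    return "username-password"

print(recommend_auth(static_ip=True))                  # ip-whitelist
print(recommend_auth(team=True, needs_tracking=True))  # api-token
print(recommend_auth())                                # username-password
```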

For most developers using ProxyLabs, username/password auth with the gateway at gate.proxylabs.app:8080 is the right default. It works with every tool (cURL, Python, Node.js, Selenium, Puppeteer, Playwright, Scrapy), supports geo-targeting and session control via username modifiers, and doesn't break when your IP changes.

Test your connection with the Proxy Tester to verify everything is working before running at scale.

Ready to try the fastest residential proxies?

Join developers and businesses who trust ProxyLabs for mission-critical proxy infrastructure.

James Liu
Lead Engineer @ ProxyLabs

Building proxy infrastructure since 2019. Previously failed at many things, now failing slightly less.
