
How to Use Proxies with axios

Axios is the leading promise-based HTTP client in the Node.js ecosystem, but its built-in proxy configuration has significant limitations with authenticated residential gateways: in some Node.js environments it fails to establish HTTPS tunnels correctly or to inject the required Proxy-Authorization headers. For production-grade scraping and API monitoring, we recommend pairing Axios with the https-proxy-agent package. This keeps the request logic cleanly separated from the transport layer and ensures the HTTP CONNECT method is used for end-to-end encryption. Axios's interceptor system also makes it an excellent choice for resilient scraping pipelines that automatically handle transient proxy failures, rotate credentials dynamically, and implement retry logic without cluttering your core business logic.

Focus: working config first, then the mistakes that usually cause traffic to bypass the proxy or break under concurrency.

Using Proxies with axios: What to Know

Axios routes its traffic through the Node.js http and https modules, but its high-level abstraction can sometimes obscure the details of proxy tunneling. When you configure an HttpsProxyAgent, it takes over the responsibility of establishing the TCP connection to the ProxyLabs gateway. For HTTPS requests, it sends an HTTP CONNECT request, creating a transparent pipe between your Node.js process and the target server. This means Axios itself only handles the application-layer request once the secure tunnel has been established by the agent.

A common point of confusion is the interaction between Axios's built-in proxy settings and custom agents. If you provide both, Axios may attempt to double-proxy or malform the request headers. The most robust pattern is to set the 'proxy' field to false in your Axios config and let the HttpsProxyAgent handle all the routing. This ensures that the Proxy-Authorization headers are correctly placed in the CONNECT request where the gateway expects them, rather than in the final request to the target site.

Connection pooling in Node.js is managed by the agent. By sharing an HttpsProxyAgent instance across multiple requests, you allow the agent to reuse existing TCP connections to the ProxyLabs gateway. This is particularly beneficial for residential proxies, as the initial handshake can be the most time-consuming part of the request. However, if you're using a rotating gateway without session IDs, you may need to control when connections are closed to ensure you're getting a fresh IP as expected.

The Node.js event loop requires that all I/O operations are non-blocking. When a proxy connection is slow or a residential IP goes offline, your Axios requests might hang. Without a properly configured timeout, these hanging requests can accumulate, eventually reaching the operating system's limit for open file descriptors. Always pair your proxy configuration with both a connection timeout and a response timeout to ensure your application remains responsive even when the network is unstable.

Axios interceptors are a powerful tool for managing proxy-related logic globally. You can use a request interceptor to inject dynamic credentials based on the target country or a session ID. Simultaneously, a response interceptor can inspect for 407 or 502 status codes and initiate a retry with a new proxy configuration. This keeps your main application logic focused on data processing, while the network-level complexities are handled in a centralized and maintainable way.

When using Axios for large-scale scraping, memory management becomes a priority. By default, Axios buffers the entire response body in memory. For thousands of concurrent requests, this can lead to high RAM usage or even crashes. Using the 'responseType: stream' option allows you to process the data as it arrives, which is significantly more efficient for memory usage. This is especially important when your residential proxy connections have high latency, as the data may arrive in small chunks over a long period.

Finally, always consider the TLS fingerprint when scraping protected targets. Node.js's default TLS implementation is easily identifiable by anti-bot services like Cloudflare or Akamai. While residential proxies mask your IP address, the TLS handshake can still give away that you're using a script. If you find your requests being blocked despite using high-quality proxies, you may need to look into advanced techniques for mimicking a browser's TLS fingerprint within the Node.js environment.

Installation

npm install axios https-proxy-agent

Working Examples

Rotating Proxy
const axios = require('axios');
const { HttpsProxyAgent } = require('https-proxy-agent');

const agent = new HttpsProxyAgent(
  'http://your-username:[email protected]:8080'
);

async function fetchWithProxy() {
  try {
    const response = await axios.get('https://httpbin.org/ip', {
      httpAgent: agent,
      httpsAgent: agent,
      timeout: 30000,
    });
    console.log(response.data);
  } catch (error) {
    console.error('Request failed:', error.message);
  }
}

fetchWithProxy();
Sticky Session
const axios = require('axios');
const { HttpsProxyAgent } = require('https-proxy-agent');

const agent = new HttpsProxyAgent(
  'http://your-username-session-abc123:[email protected]:8080'
);

const client = axios.create({
  httpAgent: agent,
  httpsAgent: agent,
  timeout: 30000,
});

async function scrapePages() {
  for (let page = 1; page <= 5; page++) {
    try {
      const resp = await client.get(`https://example.com/page/${page}`);
      console.log(`Page ${page}: ${resp.status}`);
    } catch (error) {
      console.error(`Page ${page} failed: ${error.message}`);
    }
  }
}

scrapePages();
Geo-Targeted Request
const axios = require('axios');
const { HttpsProxyAgent } = require('https-proxy-agent');

const agent = new HttpsProxyAgent(
  'http://your-username-country-US-city-NewYork:[email protected]:8080'
);

async function geoTargetedRequest() {
  try {
    const response = await axios.get('https://httpbin.org/ip', {
      httpAgent: agent,
      httpsAgent: agent,
      timeout: 30000,
    });
    console.log('NYC IP:', response.data);
  } catch (error) {
    console.error('Request failed:', error.message);
  }
}

geoTargetedRequest();

What matters in practice

  • Custom instance creation using axios.create() to share proxy configurations and default headers across a whole module.
  • Sophisticated request and response interceptors that allow for global handling of proxy authentication and automatic retries on gateway errors.
  • Native support for canceling ongoing proxy requests via AbortController, which is vital for preventing memory leaks in high-concurrency Node.js apps.
  • Automatic transformation of response data from JSON, enabling cleaner data processing pipelines without manual parsing steps.
  • Support for custom HTTP and HTTPS agents, providing the flexibility needed to handle complex tunneling scenarios or SOCKS5 configurations.
  • Granular timeout control: the 'timeout' option bounds the request/response exchange, while connection-level limits can be layered on via agent options or an AbortSignal deadline.

Operational Notes

01

Avoid using the built-in 'proxy' configuration object in Axios when working with authenticated residential proxies. Instead, use https-proxy-agent and pass it to both the httpAgent and httpsAgent fields to ensure consistent behavior across different Node.js versions.

02

Implement a response interceptor to automatically retry requests when the proxy gateway returns a 502 Bad Gateway or 504 Gateway Timeout error. These are usually temporary residential peer dropouts that resolve instantly with a fresh IP address.

03

Set a strict 'timeout' in your Axios configuration. In the Node.js event loop, an unmanaged proxy request that hangs can prevent your process from exiting and eventually lead to memory exhaustion as connections accumulate.

04

Share a single HttpsProxyAgent instance across multiple requests to benefit from the agent's internal connection pooling and keep-alive mechanisms, which significantly reduces the latency of repeated requests.

05

When using interceptors for retries, always include a maximum retry counter in the request configuration to avoid infinite loops when a target site is truly down or your proxy credentials have expired.

06

If you need to change your proxy location or session ID dynamically, create a factory function that returns a new Axios instance with a freshly configured agent for each logical scraping task.

Frequently Asked Questions

Why do I receive a 407 Proxy Authentication Required error with Axios?

The most common reason for a 407 error in Axios is incorrect placement of the Proxy-Authorization header. Axios's built-in proxy handling can attach credentials to the final request rather than to the CONNECT handshake, which is where an HTTPS gateway expects them. Using the https-proxy-agent library ensures the credentials are injected into the initial handshake with the ProxyLabs gateway. Additionally, verify that your username and password do not contain special characters that the URL parser might misinterpret; URL-encoding the credentials is a reliable way to avoid this issue.
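For example, reserved characters in a password (the password below is hypothetical) can be percent-encoded with `encodeURIComponent` before building the proxy URL:

```javascript
// '@', ':' and '/' in credentials would otherwise confuse the URL parser.
const user = 'your-username';
const pass = 'p@ss:w/rd'; // hypothetical password with reserved characters

const proxyUrl =
  `http://${encodeURIComponent(user)}:${encodeURIComponent(pass)}` +
  '@gw.proxylabs.io:8080';
// → 'http://your-username:p%40ss%3Aw%2Frd@gw.proxylabs.io:8080'
```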

Can I use Axios with these proxies in a browser environment?

No, you cannot use Axios with residential proxies directly in the browser due to security restrictions like CORS and the lack of low-level socket access for proxy tunneling. Residential proxies must be used in a Node.js backend, a serverless function, or via a dedicated browser extension. If you need to scrape data from a client-side application, you should route your requests through a backend API that handles the proxy communication and returns the sanitized data to your frontend.

How do I handle self-signed certificates when scraping through a proxy?

If the target site uses a self-signed certificate, you can configure your HttpsProxyAgent to ignore certificate validation errors by passing the rejectUnauthorized: false option. However, this is strongly discouraged for production environments as it exposes your traffic to potential man-in-the-middle attacks. A better approach is to provide the target's CA certificate to the agent's configuration, ensuring that you maintain a secure and verified connection even when scraping unconventional targets.
