
HTTP Header Order: Why It Gets You Blocked and How to Fix It

James Liu
Lead Engineer @ ProxyLabs
March 16, 2026
9 min read

You have the perfect residential proxy from gate.proxylabs.app:8080. You copied the User-Agent string from your own browser. You even matched the TLS fingerprint using a specialized library. Yet, the moment you send the request to a high-security target like Cloudflare or DataDome, you get a 403 Forbidden or an infinite CAPTCHA loop.

The reason is often hidden in the sequence of your headers. While the HTTP specification technically treats headers as unordered, real browsers follow strict, hard-coded patterns. In 2026, anti-bot systems don't just check if a header exists. They check where it sits in the list. If your "User-Agent" appears before "Host" in an HTTP/1.1 request, you aren't a browser. You're a script.

This guide breaks down exactly how header order, pseudo-headers, and the newer Client Hints (Sec-Ch-Ua) define your digital identity. We'll look at the exact sequences used by Chrome 131 and 140+, the pseudo-header differences between engines, and how to fix your implementation in Python and Node.js.

The Mechanics of HTTP Header Ordering

In the early days of the web, header order didn't matter. Servers just looked for the specific keys they needed. Modern anti-bot engines reversed this. They treat the header sequence as a fingerprint. This fingerprint is generated by the browser's underlying engine (Blink for Chrome, Gecko for Firefox, WebKit for Safari).

Because these engines are updated on different schedules and have different internal architectures, they naturally produce headers in different orders. A scraper using the Python "requests" library sends headers in an order that matches nothing in the real world. This is an instant flag.

HTTP/1.1 vs HTTP/2 Ordering

HTTP/1.1 and HTTP/2 handle headers differently. In HTTP/1.1, headers are sent as plain text lines. The order is literally the sequence of these lines in the TCP stream.

In HTTP/2, headers are compressed using HPACK. While they are still conceptually a list of key-value pairs, HTTP/2 introduces "pseudo-headers." These are special headers that start with a colon (e.g., :method). They must appear before any regular headers in the block. If you send a regular header like user-agent before :path, the request is technically invalid and suspicious.
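The "pseudo-headers first" rule can be illustrated with a small sketch. This is plain Python over a list of (name, value) tuples, not a wire-format encoder; the header values are illustrative:

```python
# Headers modeled as an ordered list of (name, value) tuples.
# In HTTP/2, every pseudo-header (name starting with ":")
# must appear before every regular header in the block.

def pseudo_headers_first(headers):
    """Return True if no pseudo-header appears after a regular header."""
    seen_regular = False
    for name, _ in headers:
        if name.startswith(":"):
            if seen_regular:
                return False  # pseudo-header after a regular header: invalid
        else:
            seen_regular = True
    return True

chrome_style = [
    (":method", "GET"),
    (":authority", "example.com"),
    (":scheme", "https"),
    (":path", "/"),
    ("user-agent", "Mozilla/5.0 ..."),
]

broken = [
    (":method", "GET"),
    ("user-agent", "Mozilla/5.0 ..."),
    (":path", "/"),  # invalid: pseudo-header after a regular header
]

print(pseudo_headers_first(chrome_style))  # True
print(pseudo_headers_first(broken))        # False
```

A server that enforces the spec can reject the second request outright; an anti-bot system will treat it as a strong automation signal even if the server accepts it.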

Pseudo-Headers: The HTTP/2 Fingerprint

Every HTTP/2 request begins with a set of pseudo-headers that define the request's core properties. Anti-bot systems use the specific order of these pseudo-headers to distinguish between browsers.

In 2026, the orders remain distinct across the major engines:

Browser            Pseudo-Header Order
----------------   ------------------------------------
Chrome (Blink)     :method, :authority, :scheme, :path
Firefox (Gecko)    :method, :path, :authority, :scheme
Safari (WebKit)    :method, :scheme, :path, :authority
If your scraper claims to be Chrome in its User-Agent but sends :scheme before :authority, a security layer like Akamai will detect the Gecko/WebKit pattern and drop the connection. Most standard libraries like Python's httpx or Node's http2 do not give you easy control over this order. They often use alphabetical order or the order of insertion in a dictionary, both of which are incorrect for browser impersonation.

HTTP/1.1 Header Sequence in Chrome 131+

Even in HTTP/1.1, order is critical. While most high-traffic sites have moved to HTTP/2 or HTTP/3, they often fall back to HTTP/1.1 for certain types of traffic or when proxies are involved.

As of Chrome 131 and extending into current 140+ versions, the standard header sequence for a primary document navigation looks like this:

  1. Host
  2. Connection
  3. Content-Length (only for POST/PUT)
  4. sec-ch-ua
  5. sec-ch-ua-mobile
  6. sec-ch-ua-platform
  7. Upgrade-Insecure-Requests
  8. User-Agent
  9. Accept
  10. Sec-Fetch-Site
  11. Sec-Fetch-Mode
  12. Sec-Fetch-User
  13. Sec-Fetch-Dest
  14. Accept-Encoding
  15. Accept-Language

Compare this to the default behavior of the Python "requests" library: User-Agent, Accept-Encoding, Accept, Connection.

The discrepancy is massive. "requests" puts the User-Agent first. A real browser puts it in the middle, after the Client Hints and before the Fetch Metadata headers. This single difference is why many developers find themselves blocked even when using high-quality proxies.
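A quick way to audit your own client is to check whether the headers it emits appear in the same relative order as the Chrome sequence above. A minimal sketch (the reference order is the Chrome 131+ list from this section, minus Content-Length, which only appears on POST/PUT):

```python
# Chrome 131+ HTTP/1.1 order for a GET navigation (from the list above).
CHROME_ORDER = [
    "host", "connection", "sec-ch-ua", "sec-ch-ua-mobile",
    "sec-ch-ua-platform", "upgrade-insecure-requests", "user-agent",
    "accept", "sec-fetch-site", "sec-fetch-mode", "sec-fetch-user",
    "sec-fetch-dest", "accept-encoding", "accept-language",
]

def matches_chrome_order(header_names):
    """True if the given headers appear in the same relative order as Chrome's."""
    rank = {name: i for i, name in enumerate(CHROME_ORDER)}
    ranks = [rank[h.lower()] for h in header_names if h.lower() in rank]
    return ranks == sorted(ranks)

# The default order sent by the Python "requests" library:
requests_default = ["User-Agent", "Accept-Encoding", "Accept", "Connection"]
print(matches_chrome_order(requests_default))  # False: User-Agent comes far too early
```

This only checks relative order; a real audit should also flag headers that are missing entirely, such as the sec-ch-ua and Sec-Fetch-* families.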

The Client Hints Revolution (Sec-Ch-Ua)

Client Hints are the industry's replacement for the bulky and often spoofed User-Agent string. While the User-Agent still exists for backward compatibility, modern Chrome versions rely on the Sec-Ch-Ua family of headers for feature detection and site logic.

If you are scraping in 2026, you cannot ignore these. If they are missing, or if they contradict your User-Agent, you are marked as a bot.

The Sec-Ch-Ua Brand List

The sec-ch-ua header contains a list of "brands" and versions. It often looks like this: "Not A(Brand";v="8", "Chromium";v="131", "Google Chrome";v="131"

The presence of the "Not A(Brand" string is a deliberate attempt to prevent servers from relying on simple string matching, but its version and position are also tracked.
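The header uses the HTTP Structured Fields list syntax. For inspection and debugging, a regex-based parse is enough to pull out the brand/version pairs (a sketch; a production client should use a proper Structured Fields parser):

```python
import re

def parse_sec_ch_ua(value):
    """Extract (brand, version) pairs from a sec-ch-ua header value."""
    return re.findall(r'"([^"]+)";v="([^"]+)"', value)

brands = parse_sec_ch_ua(
    '"Not A(Brand";v="8", "Chromium";v="131", "Google Chrome";v="131"'
)
print(brands)
# [('Not A(Brand', '8'), ('Chromium', '131'), ('Google Chrome', '131')]
```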

Consistency is Mandatory

The biggest mistake scrapers make with Client Hints is "partial spoofing." They update the User-Agent but forget the Client Hints.

Consider this scenario:

  • User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/144.0.0.0 Safari/537.36
  • Sec-Ch-Ua-Platform: "macOS"

You just told the server you are running Windows in the User-Agent, but macOS in the Client Hints. This is an impossible state. Furthermore, if your TLS fingerprint (JA4 or JA3) indicates a Windows-based TLS stack, but your headers say macOS, the entropy score for your request spikes.

Modern anti-bots look for this "entropy." They compare the OS version in the UA, the OS version in the Client Hints, the OS behavior in the TLS handshake, and even the TCP window size. All signals must agree.
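A pre-flight check for the impossible state described above can be sketched like this. The regexes and the platform-to-UA mapping are illustrative assumptions based on common UA formats, not an exhaustive rule set:

```python
import re

# Maps sec-ch-ua-platform values to the substring expected in a
# matching User-Agent string (illustrative, not exhaustive).
PLATFORM_MARKERS = {
    '"Windows"': "Windows NT",
    '"macOS"': "Mac OS X",
    '"Linux"': "Linux",
}

def headers_consistent(user_agent, sec_ch_ua, sec_ch_ua_platform):
    """Check that Chrome version and OS agree across UA and Client Hints."""
    ua_ver = re.search(r"Chrome/(\d+)", user_agent)
    hint_ver = re.search(r'"Chromium";v="(\d+)"', sec_ch_ua)
    if not ua_ver or not hint_ver or ua_ver.group(1) != hint_ver.group(1):
        return False
    marker = PLATFORM_MARKERS.get(sec_ch_ua_platform)
    return marker is not None and marker in user_agent

ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/144.0.0.0 Safari/537.36")
ch = '"Not A(Brand";v="8", "Chromium";v="144", "Google Chrome";v="144"'

print(headers_consistent(ua, ch, '"Windows"'))  # True
print(headers_consistent(ua, ch, '"macOS"'))    # False: the impossible state above
```

Running a check like this before every session catches partial spoofing before the anti-bot does.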

Fixing Header Order in Python

The standard requests library uses an internal dictionary for headers, which often reorders them. To fix this, you need a library that supports "impersonation" or allows manual control over the internal ordering of both regular and pseudo-headers.

Option 1: curl_cffi

The curl_cffi library is currently the most effective way to handle this in Python. It uses a modified version of curl that can mimic the TLS and header behavior of specific browser versions.

from curl_cffi import requests

# Using the impersonate flag handles header order, 
# pseudo-headers, and TLS fingerprints automatically.
session = requests.Session(impersonate="chrome144")

response = session.get(
    "https://api.target.com/data",
    proxies={"http": "http://gate.proxylabs.app:8080", "https": "http://gate.proxylabs.app:8080"}
)

print(response.status_code)

By setting impersonate="chrome144", the library ensures that :method, :authority, :scheme, and :path are sent in the exact Blink order. It also populates the Sec-Ch-Ua headers correctly and places the User-Agent in the right sequence.

Option 2: tls-client (Go-based)

If you need even more control, tls-client (available as a Python wrapper) allows you to specify the pseudo-header order manually.

import tls_client

session = tls_client.Session(
    client_identifier="chrome_144",
    random_tls_extension_order=True
)

# You can manually override pseudo-header order if needed
session.pseudo_header_order = [":method", ":authority", ":scheme", ":path"]

res = session.get(
    "https://target.com",
    proxy="http://gate.proxylabs.app:8080"
)

Fixing Header Order in Node.js

Node.js developers often use axios or node-fetch. Neither of these libraries provides the level of control required for high-level anti-bot bypass. axios sits on top of Node's http module, which does not preserve header case or order by default.

Using Undici

undici is the modern HTTP client for Node.js. It allows you to pass headers as an array of strings, which preserves the order.

const { Client } = require('undici');

const client = new Client('https://target.com');

async function makeRequest() {
  const { statusCode, body } = await client.request({
    path: '/',
    method: 'GET',
    // Using an array preserves the exact order
    headers: [
      'Host', 'target.com',
      'sec-ch-ua', '"Google Chrome";v="144", "Chromium";v="144", "Not-A.Brand";v="24"',
      'sec-ch-ua-mobile', '?0',
      'sec-ch-ua-platform', '"Windows"',
      'User-Agent', 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36...',
      'Accept', 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8',
      'Sec-Fetch-Site', 'none',
      'Sec-Fetch-Mode', 'navigate',
      'Sec-Fetch-User', '?1',
      'Sec-Fetch-Dest', 'document',
      'Accept-Encoding', 'gzip, deflate, br',
      'Accept-Language', 'en-US,en;q=0.9'
    ]
  });
}

While undici helps with regular headers, it still struggles with pseudo-header ordering in HTTP/2. For that, you should use a wrapper around tls-client or a specialized library such as CycleTLS.

Debugging with Charles Proxy

To truly understand header order, you need to see what your browser is doing. Do not rely on "View Source" or the "Network" tab in DevTools. Those views often present headers in a cleaned, alphabetical, or logical format rather than the raw wire format.

  1. Set up Charles Proxy: Install Charles and enable SSL Proxying for your target domain.
  2. Visit the site in Chrome: Navigate to the target site. Look at the "Request" tab in Charles and switch to the "Headers" view.
  3. Inspect the Hex/Raw view: This shows you the actual sequence as it left the browser.
  4. Compare with your script: Run your Python or Node script through Charles. If you see the headers in a different sequence, you have found your block reason.

Pay close attention to capitalization. HTTP/2 requires lowercase header names on the wire. HTTP/1.1 treats names as case-insensitive per the spec, but anti-bot systems treat casing as part of the fingerprint. If your script sends user-agent instead of User-Agent in HTTP/1.1, you will be flagged.
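If you build HTTP/1.1 requests by hand, one way to avoid the casing trap is to canonicalize names the way Chrome writes them. Note the exception: per the Chrome 131+ sequence shown earlier, the sec-ch-ua family is sent lowercase even over HTTP/1.1. A sketch:

```python
# Headers Chrome emits in lowercase even in HTTP/1.1
# (per the Chrome 131+ sequence shown earlier in this article).
LOWERCASE_HEADERS = {"sec-ch-ua", "sec-ch-ua-mobile", "sec-ch-ua-platform"}

def http1_case(name):
    """Render a header name the way Chrome writes it in HTTP/1.1."""
    lower = name.lower()
    if lower in LOWERCASE_HEADERS:
        return lower
    # PascalCase each hyphen-separated part: user-agent -> User-Agent
    return "-".join(part.capitalize() for part in lower.split("-"))

print(http1_case("user-agent"))   # User-Agent
print(http1_case("sec-ch-ua"))    # sec-ch-ua
```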

The 8-Point Consistency Checklist

Before you send a request to a high-value target, verify these eight points. If even one is off, your risk score increases.

  1. Pseudo-Header Order: For HTTP/2, are you following the :method, :authority, :scheme, :path sequence for Chrome?
  2. Case Sensitivity: In HTTP/1.1, are you using the standard PascalCase (e.g., Accept-Language)?
  3. Sec-Ch-Ua Presence: Are you sending the sec-ch-ua, sec-ch-ua-mobile, and sec-ch-ua-platform headers?
  4. Internal UA Consistency: Does the version in your User-Agent string match the version in your sec-ch-ua brand list?
  5. Platform Alignment: Does your sec-ch-ua-platform (e.g., "Windows") match the OS in your User-Agent?
  6. Fetch Metadata: Are you sending Sec-Fetch-* headers? Browsers have sent these by default for years. A request without them is highly suspicious.
  7. TLS Alignment: Does your TLS fingerprint (ALPN, ciphers, extensions) match the browser version you are claiming to be? A Chrome 144 UA with a Chrome 131 TLS profile is a flag.
  8. Proxy Integrity: Is your proxy (gate.proxylabs.app:8080) adding any headers like X-Forwarded-For or Via? ProxyLabs proxies are transparent by default, but always verify.

Conclusion

Header order is no longer a "nice-to-have" optimization. It is a fundamental requirement for web scraping and automation in 2026. Anti-bot systems have evolved from simple blacklists to complex behavioral analysis engines. They treat your request as a set of signals that must all point to the same conclusion: a human using a standard browser.

By using tools like curl_cffi and undici, and by meticulously matching the Client Hints and pseudo-header sequences, you can significantly reduce your block rate. Consistency is the key. Your TLS fingerprint, your header order, and your Client Hints must form a unified, believable identity.
