You changed your IP. The anti-bot system doesn’t care.

You’re paying Bright Data $25+ per 1,000 requests for residential proxy rotation. New IP every request. Clean IPs. Real ISPs. And you’re still getting blocked.

Here’s why: the anti-bot system isn’t looking at your IP address. It’s looking at your TLS fingerprint — a unique signature generated during the TLS handshake that identifies exactly what software is making the request. And no amount of proxy rotation changes it.

This is the dirty secret of the entire proxy industry. Bright Data, ScraperAPI, Oxylabs, ZenRows, Apify — they all sell you IP rotation as the solution to anti-bot detection. They know TLS fingerprinting makes IP rotation irrelevant against modern anti-bot systems. They sell it to you anyway.

What is a TLS fingerprint?

When your browser connects to a website over HTTPS, it performs a TLS handshake. During this handshake, the client (your browser or scraping tool) sends a ClientHello message that contains:

  • TLS version — which versions of TLS the client supports
  • Cipher suites — the encryption algorithms the client supports, in order of preference
  • Extensions — TLS extensions like SNI, ALPN, supported groups, signature algorithms
  • Elliptic curves — which curves the client supports for key exchange
  • Point formats — how elliptic curve points are encoded

Every piece of software constructs this ClientHello differently. Chrome constructs it one way. Firefox another. Python’s requests library another. Node.js another. Puppeteer another.
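You can see one slice of this directly from Python itself: the ssl module exposes the cipher suites that the default client context will offer in its ClientHello. This is a minimal sketch; the exact list depends on your Python and OpenSSL build.

```python
import ssl

# Build the default client-side TLS context, roughly what requests/urllib3 use.
ctx = ssl.create_default_context()

# The cipher suites this context will advertise in its ClientHello.
# The list (and its order) is fixed by the local TLS stack, not by the
# network, so routing through a proxy never changes it.
for cipher in ctx.get_ciphers()[:5]:
    print(cipher["name"], cipher["protocol"])
```

Run the same loop on a different machine or Python version and the list shifts — which is exactly why the handshake identifies the software that produced it.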

JA3 and JA4: Hashing the handshake

JA3 (created by Salesforce researchers) takes five fields from the ClientHello — TLS version, cipher suites, extensions, elliptic curves, and point formats — concatenates them, and generates an MD5 hash. This hash is the JA3 fingerprint.
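The construction is simple enough to sketch in a few lines: the five fields are joined by commas, the values within each field by dashes, in wire order, and the result is MD5-hashed. The field values below are illustrative placeholders, not a real capture.

```python
import hashlib

def ja3_hash(tls_version, ciphers, extensions, curves, point_formats):
    """Build a JA3 fingerprint from decimal ClientHello field values.

    Fields are joined by commas; values within a field by dashes,
    in the exact order they appear on the wire."""
    fields = [
        str(tls_version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    ja3_string = ",".join(fields)
    return hashlib.md5(ja3_string.encode()).hexdigest()

# Illustrative values only, not a real browser capture:
print(ja3_hash(771, [4865, 4866, 4867], [0, 11, 10], [29, 23, 24], [0]))
```

Because every input value comes straight from the ClientHello, two clients with the same software stack hash identically — and two different stacks almost never do.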

JA4 is the next generation, created by one of JA3’s original authors. It’s more granular, includes additional fields, and is designed to be more resilient to randomization attempts.

Here’s what matters: every HTTP client has a known JA3/JA4 hash. Anti-bot vendors maintain databases of these hashes:

  • Chrome 120+ (real): known good fingerprint, passes
  • Python requests: known bot fingerprint, blocked
  • Node.js fetch/axios: known bot fingerprint, blocked
  • Go net/http: known bot fingerprint, blocked
  • Puppeteer (headless Chrome): slightly different from real Chrome, flagged
  • Playwright: different from real Chrome, flagged
  • curl: known fingerprint, blocked on protected sites

When Akamai Bot Manager sees a request claiming to be Chrome 121 via the User-Agent header but sending a JA3 hash that matches Python requests — instant block. The request never reaches the server.
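That first-gate decision can be sketched as a lookup plus a consistency check: the JA3 hash and the User-Agent must tell the same story. The database and hashes here are hypothetical placeholders, not real vendor data.

```python
# Hypothetical JA3 database -- placeholder hashes, not real vendor data.
JA3_DB = {
    "e7d705a3286e19ea42f587b344ee6865": "python-requests",
    "b32309a26951912be7dba376398abc3b": "chrome",
}

def classify(ja3, user_agent):
    """Sketch of the first gate: block known automation stacks outright,
    and block real-browser fingerprints paired with a mismatched UA."""
    client = JA3_DB.get(ja3)
    if client is None:
        return "challenge"            # unseen fingerprint: escalate
    if client != "chrome":
        return "block"                # known automation stack
    if "Chrome/" not in user_agent:
        return "block"                # browser TLS, but the UA disagrees
    return "allow"

# A request claiming Chrome 121 but handshaking like Python requests:
print(classify("e7d705a3286e19ea42f587b344ee6865",
               "Mozilla/5.0 ... Chrome/121.0.0.0 Safari/537.36"))  # block
```

Real systems score many more signals, but the shape is the same: the cheapest, earliest check wins.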

How Akamai and Cloudflare use TLS fingerprints

Akamai Bot Manager checks the TLS fingerprint as the very first step of bot detection — before the page loads, before JavaScript runs, before any browser fingerprinting happens. If your TLS fingerprint doesn’t match a known legitimate browser, you’re blocked at the edge: the request is rejected at the CDN level and you never get HTML back.

Cloudflare does the same. Their bot management platform uses JA3 fingerprinting combined with HTTP/2 fingerprinting (the SETTINGS frame, WINDOW_UPDATE values, header ordering, and priority tree structure) to create a comprehensive connection-level fingerprint.
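The HTTP/2 side works the same way as the TLS side: compare what the client sends against the profile of a real browser. The Chrome SETTINGS values below are illustrative approximations — they vary across versions — and the matcher is deliberately strict, because extra, missing, or re-valued settings all break the fingerprint.

```python
# Approximate Chrome SETTINGS frame values (illustrative; they change
# across Chrome versions -- treat these as placeholders).
CHROME_SETTINGS = {
    "HEADER_TABLE_SIZE": 65536,
    "ENABLE_PUSH": 0,
    "INITIAL_WINDOW_SIZE": 6291456,
    "MAX_HEADER_LIST_SIZE": 262144,
}

def settings_match(client_settings, profile=CHROME_SETTINGS):
    """True only if the client sent exactly the profile's keys and values."""
    return client_settings == profile

# A generic HTTP/2 library's defaults will not match Chrome's profile:
print(settings_match({"HEADER_TABLE_SIZE": 4096,
                      "ENABLE_PUSH": 1,
                      "INITIAL_WINDOW_SIZE": 65535}))  # False
```

Combined with pseudo-header ordering and priority structure, this gives the anti-bot system a second, independent connection-level signal to cross-check against JA3.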

This is why you see instant 403s on some sites — the anti-bot system didn’t even need to run JavaScript challenges. Your TLS handshake told them everything they needed to know.

Why changing IP doesn’t change TLS fingerprint

This is the critical point that proxy providers don’t want you to understand:

Your TLS fingerprint is generated by your client software, not your network. When you route a Python request through a Bright Data residential proxy, the proxy changes the source IP. It does not change the ClientHello. It does not change the cipher suites. It does not change the TLS extensions.

The request arrives at the target server from a clean residential IP — with a TLS fingerprint that screams “Python bot.”
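The mechanics are easy to see in code. Once a CONNECT tunnel is up, a forwarding proxy’s job reduces to copying bytes in both directions — it never parses them. This is a minimal sketch of that relay loop, not a production proxy:

```python
import socket
import threading

def pump(src, dst):
    """Copy bytes from src to dst until EOF. The proxy never parses them:
    the ClientHello and every fingerprintable TLS field pass through
    verbatim -- only the source IP the origin sees is different."""
    while (chunk := src.recv(4096)):
        dst.sendall(chunk)
    dst.close()

def tunnel(client_sock, origin_host):
    """Sketch of what a CONNECT proxy does once the tunnel is established:
    relay in both directions. The TLS handshake stays end-to-end."""
    origin = socket.create_connection((origin_host, 443))
    threading.Thread(target=pump, args=(origin, client_sock), daemon=True).start()
    pump(client_sock, origin)
```

Because the handshake terminates at the origin, not at the proxy, there is no point in this pipeline where the proxy could rewrite cipher suites or extensions even if it wanted to.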

This is true for every proxy provider:

  • Bright Data — Routes your requests through residential/datacenter proxies. Your client’s TLS fingerprint passes through untouched. If you’re using Python, Node.js, or even their SDK, the TLS fingerprint is detectable.

  • ScraperAPI — Same architecture. Your requests get a new IP. Your TLS fingerprint stays the same. Akamai blocks it before the page loads.

  • Oxylabs — Same story. A pool of 100M+ residential IPs doesn’t matter when every request carries the same bot TLS fingerprint.

  • ZenRows — Claims to handle anti-bot detection. Can’t change the laws of TLS. Their requests carry detectable fingerprints on Akamai-protected sites.

  • Apify — Their proxy infrastructure changes IPs. The TLS fingerprint from their crawler environment is cataloged and blocked.

You’re paying for millions of IP addresses. The anti-bot system ignores the IP and reads the TLS fingerprint. You’re paying for the wrong thing.

“But Bright Data uses headless Chrome — doesn’t that fix the TLS fingerprint?”

Partially. When Bright Data’s Web Unlocker uses a headless Chrome instance, the TLS fingerprint is Chrome’s fingerprint — which is better than Python’s. But there are problems:

  1. Headless Chrome’s TLS fingerprint can differ from headed Chrome. Depending on the Chromium build, compilation flags, and configuration, the TLS handshake may contain subtle differences. Anti-bot systems have cataloged these differences.

  2. HTTP/2 fingerprinting compounds the problem. Even if the TLS fingerprint is perfect, the HTTP/2 connection parameters (SETTINGS frame values, pseudo-header order, priority scheme) are different in headless Chrome vs. real Chrome. Akamai and Cloudflare check both.

  3. The fingerprint is only the first gate. Even if Bright Data gets past TLS fingerprinting, browser fingerprinting (canvas, WebGL, fonts) catches the headless browser at the next layer. You need to pass every check, not just one.

TLS fingerprint spoofing: harder than it sounds

Some developers try to spoof TLS fingerprints using libraries like curl-impersonate, tls-client, or custom Golang clients with modified TLS stacks. This works against basic checks but fails against sophisticated anti-bot systems because:

  • Spoofing JA3 alone isn’t enough. Anti-bot systems check JA3 + HTTP/2 fingerprint + header order + browser fingerprint together. Matching one while mismatching others is a detection signal.

  • TLS extension ordering matters. Anti-bot systems don’t just check which extensions are present — they check the order. Real Chrome has a specific order that changes with each version. Keeping up with every Chrome version’s exact extension order is a full-time job.

  • Chrome adds randomization intentionally. Starting with Chrome 110, Google began randomizing certain TLS extension orders specifically to make fingerprinting harder. But the randomization has patterns that differ from manual randomization attempts. Anti-bot systems can distinguish Chrome’s native randomization from artificial randomization.
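Extension ordering feeds directly into the fingerprint: JA3 records extensions in wire order, so the same set of extensions in a different order yields a different hash. A minimal sketch, reusing JA3’s string-then-MD5 construction with illustrative values:

```python
import hashlib

def ja3_from_extensions(extensions):
    """JA3 over an otherwise-fixed illustrative ClientHello, varying
    only the extension order."""
    ja3_string = "771,4865-4866-4867,%s,29-23-24,0" % "-".join(map(str, extensions))
    return hashlib.md5(ja3_string.encode()).hexdigest()

a = ja3_from_extensions([0, 11, 10, 35, 16])
b = ja3_from_extensions([16, 35, 10, 11, 0])  # same extensions, shuffled
print(a == b)  # False: order alone changes the fingerprint
```

This is why a spoofing library that gets the extension *set* right but the *order* wrong still hashes to something no real Chrome build has ever sent.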

The only real solution: use a real browser

TLS fingerprinting is a connection-level check. You can’t solve it with headers. You can’t solve it with proxies. You can’t solve it with most spoofing libraries.

The only way to produce a legitimate Chrome TLS fingerprint is to use actual Chrome. Not headless Chrome. Not Chromium. Not a modified build. The real, unmodified Chrome browser.

UltraWebScrapingAPI uses real Chrome browsers. When our browser connects to an Akamai-protected site:

  • The TLS ClientHello is identical to what your personal Chrome browser sends
  • The HTTP/2 SETTINGS frame matches real Chrome exactly
  • The header ordering matches real Chrome exactly
  • The browser fingerprint matches real Chrome exactly

Every layer of detection sees a real browser because it is a real browser. There’s nothing to detect.

Stop paying for proxy rotation that doesn’t work

The proxy rotation industry is selling you a countermeasure that anti-bot systems defeated back in 2022. IP rotation was effective when anti-bot detection was IP-based. It’s 2026. Detection is fingerprint-based.

Bright Data’s 72M residential IPs don’t help when Akamai reads the TLS fingerprint. ScraperAPI’s “render=true” mode doesn’t help when Cloudflare checks the HTTP/2 connection parameters. Oxylabs’ datacenter proxies definitely don’t help.

You need a fundamentally different approach. You need real browsers with real TLS fingerprints. You need UltraWebScrapingAPI.


See what a real TLS fingerprint looks like in action. Test any protected URL in our playground and watch it pass where Bright Data fails — Try the Playground.