Unseen Scraper Attacks Leave WordPress Sites Crawling, Analytics Blind
A massive automated scraping campaign has been causing intermittent slowdowns on a high-profile WordPress site for months, and standard analytics tools like Google Analytics never detected it. The culprit: Go-http-client/1.1, the default user agent of the Go programming language's standard HTTP client, hammering the server with over 67,000 requests, all invisible to browser-based tracking.

"The client reported slowness for months, but every diagnostic showed normal traffic," said Alex Chen, lead engineer at SysWP, the server-monitoring firm that uncovered the attack. "We dug into raw server logs and found a flood of requests from a non-browser user agent, completely missed by Google Analytics, Plausible, and Fathom."
How the Attack Worked
The analysis revealed that a single user agent, Go-http-client/1.1, accounted for 99% of all unknown traffic, generating 67,323 hits in the observation window. That user agent identifies Go's built-in HTTP client, which issues automated requests without a browser: no JavaScript executes, so analytics snippets never fire.
Each request consumed a PHP worker, hit the database, and fully rendered WordPress. On a hosting plan with limited concurrent workers, this caused intermittent resource contention. "Sometimes the site was fine; sometimes it crawled—classic symptoms of a scraper competing with real users for server capacity," Chen explained.
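The contention Chen describes can be sketched with back-of-envelope arithmetic. The worker count and render time below are illustrative assumptions, not measurements from the incident:

```python
# Back-of-envelope sketch of PHP worker contention. The numbers are
# assumptions for illustration, not figures from the incident.
WORKERS = 10          # concurrent PHP workers on the hosting plan (assumed)
RENDER_SECONDS = 0.5  # average time to fully render a WordPress page (assumed)

# Sustainable capacity of the worker pool, in requests per second:
capacity = WORKERS / RENDER_SECONDS  # 20 req/s

# If a scraper bursts at 15 req/s, only 5 req/s of headroom remains for
# real visitors; any burst above capacity queues and the site "crawls".
scraper_rate = 15
headroom = capacity - scraper_rate
print(headroom)  # → 5.0
```

This is why the symptoms were intermittent: whenever combined scraper and human traffic stayed under capacity the site felt normal, and whenever a burst exceeded it, requests queued.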
Background: Why Analytics Miss the Threat
Google Analytics and similar tools rely on a JavaScript snippet that executes only in a browser environment. Non-browser requests—from bots, scrapers, or API clients—never trigger the snippet, creating a blind spot. This leaves site owners unaware of huge traffic sources that degrade performance.
Server-level monitoring, such as SysWP Radar, captures all requests at the network layer before any client-side code runs. Only there did the full picture emerge: a hidden ecosystem of malicious and suspicious traffic, including axios/1.15.0 (308 hits) and other Node.js-based scrapers.
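The principle behind server-level capture is simple. As a minimal, hypothetical sketch (not SysWP Radar's actual implementation), a WSGI-style middleware can tally every request's user agent before any client-side JavaScript could possibly run:

```python
# Minimal sketch of server-level traffic capture (a hypothetical
# stand-in, not SysWP Radar's implementation): a WSGI middleware sees
# every request, browser or not, before any client-side code runs.
from collections import Counter

ua_counts = Counter()

def count_user_agents(app):
    """Wrap a WSGI app so every request's User-Agent is tallied."""
    def middleware(environ, start_response):
        ua_counts[environ.get("HTTP_USER_AGENT", "-")] += 1
        return app(environ, start_response)
    return middleware

def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

app = count_user_agents(demo_app)

# Simulate one browser hit and two scraper hits.
for ua in ("Mozilla/5.0", "Go-http-client/1.1", "Go-http-client/1.1"):
    app({"HTTP_USER_AGENT": ua}, lambda status, headers: None)

print(ua_counts["Go-http-client/1.1"])  # → 2
```

Unlike a JavaScript snippet, this counter cannot be bypassed by a non-browser client, because it runs on the server for every request that reaches the application.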

What This Means for WordPress Site Owners
This incident shows that slow site performance can originate from traffic your analytics never sees. The risks go beyond speed: systematic scraping can exhaust PHP worker pools, trigger 503 errors, inflate hosting costs, harvest content for AI training or republishing, and even degrade Core Web Vitals scores.
"If you rely solely on Google Analytics, you're flying blind to the most damaging traffic," Chen warned. "Server-level analytics are no longer optional—they're essential for understanding what's really hitting your server."
Immediate Actions to Take
- Enable server-level logging (e.g., access.log) and analyze for non-browser user agents.
- Use a web application firewall (WAF) to block requests from known automated user agents such as Go-http-client or axios.
- Consider adding a bot detection service that works at the server level.
- Monitor PHP worker usage in real time to spot contention before users complain.
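The first action above can be started with a few lines of scripting. A hedged sketch, assuming the common combined log format and a crude browser heuristic (adjust the regex and the hint list to your own server's logs):

```python
# Count non-browser user agents in a combined-format access log.
# The log format regex and the browser heuristic are assumptions;
# adapt both to your server's configuration.
import re
from collections import Counter

# Combined log format ends with: "referer" "user-agent"
UA_RE = re.compile(r'"[^"]*" "(?P<ua>[^"]*)"$')
BROWSER_HINTS = ("Mozilla", "Safari", "Chrome", "Firefox", "Edg")

def count_non_browser(lines):
    counts = Counter()
    for line in lines:
        m = UA_RE.search(line.strip())
        if not m:
            continue
        ua = m.group("ua")
        if not any(hint in ua for hint in BROWSER_HINTS):
            counts[ua] += 1
    return counts

sample = [
    '1.2.3.4 - - [10/May/2025:12:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Go-http-client/1.1"',
    '5.6.7.8 - - [10/May/2025:12:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(count_non_browser(sample))  # → Counter({'Go-http-client/1.1': 1})
```

In practice you would feed this the real access.log (for example, `count_non_browser(open("/var/log/nginx/access.log"))`) and sort the result to surface the heaviest non-browser agents, as the SysWP analysis did.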
For a deeper dive, revisit the Background section above on why analytics miss this traffic, and the What This Means section on the risks to your site.