Leigh Beadon at Techdirt: Web Sites’ Traffic Is Fake, Audience Numbers Are Garbage, And Nobody Knows How Many People See Anything.
Where should we start? How about this: internet traffic is half-fake, everyone’s known it for years, and nobody has any incentive to actually acknowledge it. The situation is technically improving: 2015 was hailed (quietly, among people who aren’t in charge of selling advertising) as a banner year because humans took back the majority with a stunning 51.5% share of online traffic, so hurray for that, I guess.

The analytics suites, ad networks, and tracking pixels can try as they might to filter the rest out, and there’s plenty of advice on that endless Sisyphean task. But at least half of all bot traffic comes from bots in the “malicious” or at least “unauthorized” category, which have every incentive to subvert the mostly-voluntary systems that are our first line of defence against bots. Well, good luck.

We already know that Alexa rankings are garbage, but what does this say about even the internal numbers that sites use to sell ad space? Could they be off by a factor of 10? I don’t know, and neither do you. Hell, we don’t even know how accurate that 51.5% figure is; it could be way off, in either direction.
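Those “mostly-voluntary systems” are things like robots.txt and self-identifying User-Agent strings, which only work on crawlers that choose to play along. A minimal sketch (hypothetical, not from the article) of why that kind of filtering is a losing game:

```python
# Hypothetical User-Agent-based bot filter, the kind analytics tools rely on.
# "Polite" crawlers announce themselves in the User-Agent header, so a simple
# substring check catches them. A malicious bot just sends a browser-like
# string and walks right past it -- which is why the 51.5% figure is shaky.

KNOWN_BOT_MARKERS = ("bot", "crawler", "spider")  # illustrative list only

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the User-Agent self-identifies as a crawler."""
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

# An honest crawler is caught; a lying one is counted as a human pageview.
print(looks_like_bot("Googlebot/2.1 (+http://www.google.com/bot.html)"))  # True
print(looks_like_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))        # False
```

The filter only ever catches the bots that volunteer the information, which is exactly the half of bot traffic that was never the problem.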
There’s a sci-fi story waiting to be written about a world where the humans have all killed one another off, but the machines keep the internet running because their purpose in life is to index and spam everything.