The infrastructure underlying the internet provides access to roughly 3 billion users. It may therefore sound absurd to announce that, when it comes to website hits, billions of humans are outnumbered by another, larger community.
Bots are automated agents that act like human web surfers. In their beneficial forms, they serve as scouts for search engines, dummy users for performance testing, RSS feed generators, and more.
Unfortunately, not all bots exist to serve mankind in our pursuit of progress. For every beneficial non-human visitor, another is spawned to wreak havoc.
Impersonators are bots designed to circumvent rudimentary humanity checks by imitating human-like behavior (random "think" times, thematic browsing flow, single-tasking each IP, etc.).
You may be asking yourself, "didn't we solve this bot issue with CAPTCHA?" As a matter of fact (and ironically enough), bots have become better at solving CAPTCHAs than humans.
Poorly protected websites are left at the mercy of the bot's master, who might attack the site with a DDoS, scrape original content for unauthorized use on his own website, probe for security holes, or fill the site's user-editable fields with advertising. Another type of damage occurs when bots are used to fake interest in an online item, such as likes, views, and clicks, effectively robbing money from the advertised business and destroying the reputation of the medium.
Such illegitimate designs have been found to target sites both large and small, demonstrating a steady improvement in the technology and malicious tactics employed by bot masters. The research by Incapsula is based on a sizeable sample: 20,000 websites contributed their traffic information, comprising 15 billion visits recorded over 90 days, from every country in the world.
Aside from the big picture, which is useful for a general discussion, the breakdown by enterprise size makes the findings relevant to entrepreneurs of any caliber. Small and medium websites receive proportionally more automated visits than bigger websites. This can be explained by the one-time nature of some bots, which enter a site only once, compared to real visitors, who return several times or on a regular basis. Low-traffic websites therefore have comparatively more of these "visit once" entries in their logs, so the bandwidth wasted on bot traffic is felt most acutely by small and medium online operations.
Although bandwidth cost is a real threat to profitability, the risk of malicious activity is a true nightmare for site owners. Impersonator bots, which probe web pages for weaknesses to be abused later in a DDoS attack or a hack attempt, represent 75% of the bad bot traffic. Another 10% is attributed to straight hacking tools, which make no attempt to hide and instead go directly for the jugular. A roughly equal share belongs to scrapers: why invest in content when you can steal it from others? The rest is a small but noticeable amount of spam-bot activity, posting public comments and other interactive data on open websites on behalf of their advertisers.
The response by security specialists is to enhance the capabilities of activity profilers, relying on sophistication drawn from artificial intelligence and behavioral psychology rather than signature analysis or challenges. The better solutions also offer an IP database that stores a reputation for different users and can decide, by looking at their interests and activity, whether they are human or robotic.
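To make the idea concrete, here is a minimal sketch of how such a profiler might combine an IP-reputation prior with simple behavioral signals. All names, thresholds, and the toy reputation table are illustrative assumptions, not any vendor's actual product or API.

```python
# Hypothetical sketch: score a visitor using an IP reputation prior
# plus behavioral signals (click-timing variance, browsing depth),
# rather than signatures or CAPTCHA challenges.
# All names and thresholds here are assumptions for illustration.

from statistics import pstdev

# Toy reputation database: IP -> prior score in [0, 1]; higher = more likely human.
REPUTATION_DB = {
    "198.51.100.7": 0.9,   # previously well-behaved visitor
    "203.0.113.42": 0.1,   # previously flagged as automated
}

def classify_visitor(ip, request_intervals, pages_visited):
    """Return 'human' or 'bot' based on reputation plus behavior."""
    score = REPUTATION_DB.get(ip, 0.5)  # unknown IPs start neutral

    # Humans pause irregularly between clicks; many bots fire
    # requests at near-constant intervals. Reward timing variance.
    if len(request_intervals) >= 2 and pstdev(request_intervals) > 0.5:
        score += 0.2
    else:
        score -= 0.2

    # Single-page hit-and-run sessions are typical of one-shot bots.
    if pages_visited > 3:
        score += 0.1

    return "human" if score >= 0.5 else "bot"

# A known-good IP with irregular click timing and a deep session:
print(classify_visitor("198.51.100.7", [1.2, 4.8, 0.9, 7.5], pages_visited=5))  # human
# An unknown IP firing metronomic requests at a single page:
print(classify_visitor("10.0.0.1", [0.5, 0.5, 0.5, 0.5], pages_visited=1))      # bot
```

In a real product the reputation store would be shared across many sites and continuously updated, which is what lets a service flag a "new" visitor whose IP has already misbehaved elsewhere.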
About the Author:
Debbie Fletcher is an enthusiastic, experienced writer who has written for a range of different magazines and news publications over the years.