What is bot traffic?

A bot – also known as an Internet bot or web robot – is a program or script that runs automated tasks over the Internet. It is typically intended to perform simple, repetitive tasks that would be time-consuming, mundane or impossible for a human. The most advanced bots are powered by artificial intelligence, but as this technology is still taking shape, most bots follow a set of rules programmed by a human via a bot-building platform. While bots can be used for productive tasks, they often come in the form of malware. Bots are most prominently used by search engines like Google, Bing, Yandex or Baidu for web crawling: these bots periodically collect information from hundreds of millions of domains and index it into their results pages.

Chat bots are highly important for marketing purposes, as they bridge the gap between conversational marketing and customer service. Since numerous studies show that customers respond positively to live chat on websites, marketers believe chat bots can greatly improve a business. Chat bots interact with users on many different levels. A phenomenon worth mentioning in this field is Xiaoice – a Chinese chat bot developed by Microsoft that has engaged with over 100 million people worldwide in more than 30 billion dialogues. Xiaoice is able to dynamically recognize emotion and engage the user throughout long conversations. For many, it is much more than just a service: countless users have established an emotional connection with it and have even told Xiaoice that they love it.

Good bots vs. bad bots

In general, bots can be broken down into two categories – good bots and bad ones. As mentioned above, good bots are created to make human lives easier, and their activities include web crawling, website monitoring, content retrieval, data aggregation, online transactions and so on. Bad bots bring fake (bogus) traffic to your website, and their malicious activities may include stealing valuable data, content/price scraping, posting spam comments and phishing links, distorting web analytics, damaging SEO, contributing to DDoS attacks and so on.

Nowadays, almost 56% of the bot traffic we encounter on the Internet serves malicious purposes. Below are the most common types of good and bad bots, along with some examples. Keep in mind that there is no predefined, general classification of bots, and you may come across different categories – some more general, others more specific.

The good guys:

  • Crawlers / spiders (e.g. Googlebot, YandexBot, Bingbot) – Used by search engines and online services to discover and index website content, making it easier for Internet users to find it.
  • Traders (e.g. Bitcoin trading bots) – Used by e-commerce businesses to act as agents on behalf of humans, interacting with external systems to accomplish specific transactions and moving data from one platform to another. Based on the given pricing criteria, they search for the best deals and then automatically buy or sell.
  • Monitoring bots (e.g. Pingdom, Keynote) – Monitor the health of a website, evaluate its accessibility and report on page load times and downtime duration, keeping it healthy and responsive.
  • Feedfetchers / informational bots (e.g. Pinterest bot, Twitter bot) – Collect information from different websites to keep users or subscribers up to date on news, events or blog articles. They cover different forms of content fetching, from updating weather conditions to censoring language in comments and chat rooms.
  • Chat bots (e.g. Messenger, Slack, Xiaoice) – Services that interact with users via a chat interface about any number of things, ranging from the functional to the fun.
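As a minimal illustration of how a site might recognize some of these good bots, here is a sketch that matches a request's User-Agent header against a few well-known crawler tokens. The token list and purpose labels are illustrative assumptions, not an exhaustive registry – and since User-Agent strings are easily spoofed, production systems also verify crawlers by other means, such as reverse-DNS lookups.

```python
# Illustrative only: a tiny User-Agent classifier for well-known good bots.
# User-Agent strings can be forged, so this is a hint, not proof.

KNOWN_GOOD_CRAWLERS = {
    "googlebot": "Google search indexing",
    "bingbot": "Bing search indexing",
    "yandexbot": "Yandex search indexing",
    "pingdom": "Uptime monitoring",
}

def classify_user_agent(user_agent):
    """Return the crawler's purpose if a known token matches, else 'unknown'."""
    ua = user_agent.lower()
    for token, purpose in KNOWN_GOOD_CRAWLERS.items():
        if token in ua:
            return purpose
    return "unknown"

print(classify_user_agent(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # Google search indexing
```

A real deployment would treat a match only as a first filter, then confirm the claimed identity before whitelisting the visitor.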

The bad guys:

  • Impersonators – Designed to mimic human behavior to bypass security measures and, following offsite commands, steal data or bring down a website. This category also includes propaganda bots, used by states to manipulate public opinion.
  • Scrapers – Scrape and steal original content and relevant information, often reposting it on other websites. Scrapers can reverse-engineer pricing, product catalogues and business models, or steal customer lists and email addresses for spam purposes.
  • Spammers – Post phishing links and low-quality promotional content to lure visitors away from a website and ultimately drive traffic to the spammer's own site. They often use malware or black hat SEO techniques that may lead to the infected site being blacklisted. A specific type of spammer is the auto-refresh bot, which generates fake traffic.
  • Click / download bots – Intentionally interact with or click on PPC and performance-based ads. The cost of such ads increases with exposure – the more people reached, the more expensive they are. This form of ad fraud is relatively new, but already quite common: according to paid advertising experts, one in five paid clicks was fraudulent in January 2017.

How does bot traffic damage your performance?

There are a number of ways bots can affect your webpage and your business's overall performance.

1. Bots contribute to DDoS attacks – A DDoS (distributed denial-of-service) attack is a malicious attempt to make a server or network resource unavailable to users. DDoS attacks are often performed by botnets – groups of hijacked Internet-connected devices infected with malware and controlled from a remote location without the knowledge of the devices' owners. A successful DDoS attack results not only in short-term loss of business; it can also have long-term effects on your online brand reputation, generate significant costs from hosting providers or even compromise your business.
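One common first line of defense against this kind of request flood is per-client rate limiting. The sketch below shows the idea with a sliding window; the thresholds are invented for illustration, and serious DDoS mitigation happens further upstream (CDNs, scrubbing services), not in application code alone.

```python
# Minimal sliding-window rate limiter sketch. Limits are illustrative;
# real deployments tune them per endpoint and enforce them at the edge.
from collections import deque
import time

class RateLimiter:
    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = {}  # client_ip -> deque of recent request timestamps

    def allow(self, client_ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits.setdefault(client_ip, deque())
        # Drop timestamps that have fallen outside the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the limit: reject or challenge the client
        q.append(now)
        return True

limiter = RateLimiter(max_requests=3, window_seconds=1.0)
print([limiter.allow("203.0.113.7", now=t) for t in (0.0, 0.1, 0.2, 0.3)])
# [True, True, True, False]
```

A blocked client can be served an error page or a CAPTCHA challenge instead of the real content, which is cheap for you and expensive for the bot.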

2. Damage your SEO and website reputation – Firstly, scrapers stealing your content and illegally distributing it on other websites might degrade your SEO and outrank you on search engine listings. Secondly, if your website gets a lot of fake views generated by malicious bots, search engines will question its credibility. As advertising networks consider fake views a form of fraud, you might end up with a penalized website. If the pattern repeats, advertising networks could even blacklist or remove your website.

3. Bots can take over your account – Bots can hack your website, steal your data and make it available on hacker dump sites and black markets. Loss of customers' sensitive information can greatly damage brand reputation and result in high costs.

4. Bots can lead to monetary loss – Besides all the threats listed above, bots can cause direct monetary loss: your paid ad campaigns become more expensive and less effective because of fraudulent clicks; your visitors might be lured away from your site via comment spam links and a poor UX; stolen content might require costly legal action. Not to mention that server and bandwidth costs increase when bots hit the website with millions of unwanted requests within a short time frame.

How do you detect bot activity?

To determine whether bots are messing with your webpage, you need to dig a little into your analytics. What can indicate suspicious bot activity?

  • Uneven traffic – If you see an unusual increase in page views and you haven't recently run a big ad campaign, bots may be behind it.
  • Abnormally low time spent on a page and increased bounce rates – Because bots are programmed to perform their tasks at high speed, they can crawl numerous pages within a small time frame. If you see many page visits that last only a few seconds, you might be looking at bot activity.
  • More visits than actual customers – If you notice a sudden increase in monthly website visits, you should check where your traffic is coming from.
  • Unknown domains referring traffic to your site – If you suddenly see a spike in referral traffic, or hordes of users hitting your site directly every day, it's probably bots.
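The speed-based checks above can also be applied directly to raw access-log data: group requests by visitor and flag anyone whose average gap between requests is faster than a human could plausibly browse. The field layout, the one-second threshold and the IPs below are illustrative assumptions, not calibrated values.

```python
# Rough sketch: flag visitors whose requests arrive at inhuman speed.
# Thresholds are made-up examples; tune them against your own traffic.

def flag_suspected_bots(events, min_gap_seconds=1.0, min_requests=5):
    """events: iterable of (client_ip, unix_timestamp) pairs."""
    by_ip = {}
    for ip, ts in events:
        by_ip.setdefault(ip, []).append(ts)

    suspects = []
    for ip, times in by_ip.items():
        times.sort()
        if len(times) < min_requests:
            continue  # too few requests to judge
        gaps = [b - a for a, b in zip(times, times[1:])]
        if sum(gaps) / len(gaps) < min_gap_seconds:
            suspects.append(ip)
    return suspects

events = [("198.51.100.9", t * 0.1) for t in range(20)]  # 10 req/s: bot-like
events += [("192.0.2.44", t * 30.0) for t in range(6)]   # one page per 30 s
print(flag_suspected_bots(events))  # ['198.51.100.9']
```

On its own this will also catch legitimate crawlers, so flagged IPs are candidates for closer inspection rather than automatic blocking.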

As bots can be very dangerous for your website and your business as a whole, you will want to do everything you can to remove bot traffic from your website and prevent an attack. To protect the credibility of your web traffic against bot activity, you need to harden your site's security. There are some steps you can take on your own, like implementing CAPTCHAs on forms or blocking IP addresses. However, the most effective way to identify and mitigate bots is to use a specialized tool.
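The "block it yourself" step can be as simple as a request filter that rejects denylisted IPs and suspicious User-Agent substrings, as in the hedged sketch below. The lists here are invented examples; in practice you would maintain them from your own logs (or let a managed bot-management service do it), and you would enforce the rules at the web server or firewall rather than in application code.

```python
# Illustrative request filter. The denylists are made-up examples only;
# blocking generic tool UAs like curl can also hit legitimate users.

BLOCKED_IPS = {"203.0.113.66"}
BLOCKED_UA_SUBSTRINGS = ("python-requests", "curl", "scrapy")

def should_block(client_ip, user_agent):
    """Return True if the request matches the IP or User-Agent denylist."""
    if client_ip in BLOCKED_IPS:
        return True
    ua = user_agent.lower()
    return any(token in ua for token in BLOCKED_UA_SUBSTRINGS)

print(should_block("203.0.113.66", "Mozilla/5.0"))           # True
print(should_block("198.51.100.1", "python-requests/2.31"))  # True
print(should_block("198.51.100.1", "Mozilla/5.0 (X11)"))     # False
```

Static denylists age quickly, which is one reason dedicated tools that score behavior in real time tend to outperform hand-maintained rules.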

Summary

As bots account for more than half of all web traffic and most of them have malicious intent, it's more important than ever to protect your website from bogus traffic. Over time, the negative effects of bot traffic and ad fraud can heavily influence your web strategy, wasting money, sales, time and effort. Is there hope for the industry? How do market giants deal with the ad fraud threat? According to Robert Gryn, we can expect a shift from quantity to quality. But before that happens, make sure your website is safe by keeping an eye on your analytics and implementing an effective bot management tool.


Key takeaways:

  • More than half of the web traffic is bots, but not all of them are bad;
  • Good bots are responsible for chats, gathering information, page monitoring, trading, and indexing websites;
  • Bad bots may impersonate humans to steal data or bring down a website, distribute propaganda, steal content, post spam, and click on paid campaigns;
  • The threats: skewed analytics, DDoS attacks, SEO and website reputation damage, account takeover, or money loss;
  • To find out if bots are messing with your website, check your analytics for uneven traffic, abnormally low time spent on a page, increased bounce rates, the origin of your traffic, and unknown referrals.
