How Bots Can Impact Your Site

When it comes to optimizing our sites, there are many factors we need to monitor and adjust to ensure that our users have the best experience possible. One factor that can heavily impact our site's performance and user experience is bot traffic on our pages.

What are bots?

Bots are automated programs that complete specific tasks. A bot could be programmed to crawl a website, make a social media post, block ads, and so forth. Overall, bots are tools that can quickly accomplish repetitive tasks on the internet, and they interact with nearly everything online, including our sites.

There are good and bad reasons a bot could interact with our site. For example, Google uses bots to crawl and index our sites, ranking them and making them visible in Google search results. Google’s bots are important for getting our site in front of future customers. Beyond searchability, bots can help create better user experiences by blocking ads, monitoring your site for analytics and issues, automatically posting content, and handling many other tasks.

However, there are also negative reasons a bot might visit a site. Bots have been used to steal content, serve spam, deny service, and carry out other malicious acts.

What are some impacts of bots?

Bots can impact our sites in several ways, for better or for worse.

Performance

Bot traffic can slow down our pages, negatively impacting the experience of real users. Because slow pages drive visitors away, bots can even be programmed to flood a site specifically to deny customers easy access to it. Even good bots, such as crawlers and indexers, can decrease our site's performance.

Reputation

Depending on what a bot is doing on your site, your site's reputation can be affected. Bots that share quality content or answer user questions with ease, such as chatbots, can improve your users' experience and thus your reputation. On the other hand, bots can be used to spread spam onto your website, such as bad reviews or malicious links. While these bots are fake, they mimic human language well enough that the content they share can give your site a bad reputation.

Security

While bots are not inherently a threat to security, they can be used to steal user information and to attack our sites. Bots can be programmed to collect sensitive information as a customer fills out a form, or to share phishing links on your site. They can also be used to attack your site's ability to serve customers, overwhelming it until it no longer functions properly, in what is known as a denial-of-service attack.

Analytics

Since bots can visit your site, the data your analytics software collects can be skewed by bot interactions. Whether intentional, as in a denial-of-service attack, or accidental, as when bots crawl your site, your traffic data may show unexpected anomalies. These anomalies can make it hard to tell how well your site is really doing.

How do I protect my site from bots?

Since there are both good and bad bots, we need to let the good bots in while keeping the bad bots out. There are a few steps we can take to identify bad bots and to limit the negative impacts even good bots can have.

To start, we should watch our site data carefully. If you are using Google Analytics, for example, watch for anomalies in your traffic. Is there a sudden spike in traffic for no apparent reason? Is there a page on your site with an unusual number of visitors but no real user interactions? Is there an increase in failed logins? Anomalies like these can be a sign that a bot is negatively impacting your site, possibly for malicious reasons.
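
If you like to dig into the numbers yourself, a simple script can do a first pass over an exported report. Here is a minimal sketch in Python, assuming a hypothetical daily_traffic.csv export with "date" and "pageviews" columns (adjust the file and column names to whatever your analytics tool actually exports):

    import csv
    import statistics

    # Hypothetical analytics export: one row per day with
    # "date" and "pageviews" columns.
    with open("daily_traffic.csv", newline="") as f:
        rows = [(r["date"], int(r["pageviews"])) for r in csv.DictReader(f)]

    views = [v for _, v in rows]
    mean = statistics.mean(views)
    stdev = statistics.stdev(views)

    # Flag any day more than three standard deviations above average,
    # a crude but useful first signal that bots may be inflating traffic.
    for date, v in rows:
        if v > mean + 3 * stdev:
            print(f"{date}: {v} pageviews (average is {mean:.0f}), investigate")

A flagged day is not proof of bot activity, but it tells you where to look more closely.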

Once our analytics reveal anomalies, we should block any identified bad bots from our site. We can do this by blocking their IP addresses. Since bots reach our sites over the internet, each one must connect from an IP address. Because blocking bots one by one can be inefficient, and at times impossible, we can also use an IP blocklist to preemptively block addresses known to be suspicious or dangerous.
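
How you block an address depends on where your site is hosted. As one illustration, if your site runs on an Apache web server, you can deny specific addresses in your site's .htaccess file. A minimal sketch, using placeholder addresses from the reserved documentation ranges rather than real offenders:

    # Allow everyone except the listed addresses (placeholders only)
    <RequireAll>
        Require all granted
        Require not ip 192.0.2.10
        Require not ip 198.51.100.0/24
    </RequireAll>

Other servers have equivalents (nginx uses a deny directive, for instance), and many managed hosts and services such as Cloudflare let you maintain a blocklist without editing server files at all.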

Since even good bots can put a strain on our pages, it's also important to optimize our robots.txt file. A robots.txt file is how we communicate with and direct crawler bots, such as Google's indexing bots, on how they should traverse our website. Here we can establish rules on which pages and directories these bots may crawl and which they should leave alone. By having an optimized robots.txt file, we can limit the performance impact these bots have on our site.
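
A minimal sketch of a robots.txt file, placed at the root of your site (the paths below are placeholders; substitute the directories you actually want crawlers to skip):

    # Rules for all crawlers that honor robots.txt
    User-agent: *
    # Keep crawlers out of pages that don't belong in search results
    Disallow: /admin/
    Disallow: /cart/
    # Point crawlers to your sitemap so they can index efficiently
    Sitemap: https://www.example.com/sitemap.xml

Keep in mind that robots.txt is a request, not an enforcement mechanism: well-behaved crawlers like Googlebot honor it, while malicious bots typically ignore it, which is why the blocking steps above still matter.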

Want to Learn More?

Bot traffic is just one of many factors we need to consider when optimizing our site. Check out our blog for more tips and guides on how we can make our sites better! If you want more hands-on guidance for improving your website, join the waitlist for Carrie Saunders’ upcoming course, “The Converting Website.” In this course, she dives into a variety of important factors that will help optimize your website.