A Slow Website = Cart Abandonment
Poor website performance is one of the worst things that can happen to an eCommerce business. Fortunately, in most cases there’s a simple way to speed up the site’s performance, provided its basic settings are configured correctly.
A slow website is one of the more obvious technological risks for an online business, and poor performance is one of the top reasons visitors leave a site. If some of your users have been complaining about slow page loads, it’s very likely that you’re already missing out on a significant share of customers and revenue.
There are two main culprits when it comes to slow website performance: misconfiguration and overload. In most cases, misconfiguration requires special access to infrastructure and can only be addressed internally. For this reason, we’ll be taking a closer look at overload in this article.
Today, a significant portion of web traffic is generated by bots. Huge numbers of automated clients crawl your website for content, scan it for potential vulnerabilities, and try to guess passwords for user accounts. These bots bring no value to your business and can even be harmful to your site: they create unnecessary extra traffic and server load.
Simply blocking these bots makes your site run faster on the same hardware.
When managing a website, there is no need to rely on gut decisions. This is that rare case in life when almost everything can be counted and measured.
All you need to do is measure your incoming traffic and determine the traffic share of each of the main categories of visitors: humans, “good” bots (e.g. search engine crawlers), and “bad” bots such as content scrapers and various hacking automation tools.
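As a very rough illustration of what this categorization means, here is a sketch that sorts requests into those three buckets by user-agent string. The patterns and sample user agents are made up for the example; real bot detection (which is what a service like BotGuard does) relies on behavioral signals, since bad bots routinely fake their user agents.

```python
import re
from collections import Counter

# Illustrative patterns only -- not a reliable detection method.
GOOD_BOTS = re.compile(r"Googlebot|Bingbot|DuckDuckBot", re.I)
BAD_BOTS = re.compile(r"curl|python-requests|Scrapy|wget", re.I)

def classify(user_agent: str) -> str:
    """Assign a request to a rough visitor category by its user agent."""
    if GOOD_BOTS.search(user_agent):
        return "good bot"
    if BAD_BOTS.search(user_agent):
        return "bad bot"
    return "human (or unknown)"

# Hypothetical user agents standing in for an access log.
user_agents = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "python-requests/2.31.0",
]

counts = Counter(classify(ua) for ua in user_agents)
total = sum(counts.values())
for category, n in counts.items():
    print(f"{category}: {100 * n / total:.0f}% of requests")
```

In practice you would feed real access-log entries into `classify` (or let the monitoring service do it for you) and read the resulting percentages off the dashboard.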
Here are the steps:
1. Register with botguard.net. The service is free when used for traffic analysis, and you won’t be asked to provide credit card details.
2. Integrate the BotGuard service in monitoring mode. For this you will need to install a Content Management System plugin or a web server extension module; consult the integration documentation to find the method that is right for you. The process is quite simple and only takes 5–10 minutes.
3. Once monitoring mode is activated, we advise that you wait until the number of processed requests reaches at least 5,000–10,000. You can view these numbers in the dashboard.
4. Next, define the categories that you would like to block. We recommend blocking scrapers, human-mimicking bots, suspicious visitors, and security events, but these are just guidelines and the choice is up to you. Note that you are not actually blocking these visitors yet; you are only running a hypothetical scenario to understand how much traffic would decrease if you decided to block them.
5. Now you’re ready to calculate how much incoming traffic will decrease if you block these categories of visitors. First, add up the traffic shares of the categories you intend to block; the sum is an estimate of how much the total traffic reaching your site will decrease. Then, to get a very rough idea of how much the load on the website will drop, multiply this combined traffic share by two.
Bot traffic usually creates a disproportionate load compared to human traffic. A typical human visitor looks at a handful of pages of interest, while a typical bot hammers the most resource-intensive parts of an eCommerce site, for instance by crawling thousands of product pages. The recipe above is sufficient for a rough calculation, but keep in mind it only gives you the minimum level of possible load reduction.
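The arithmetic in step 5 can be sketched in a few lines. The traffic shares below are invented for illustration; substitute the percentages you actually read off the dashboard.

```python
# Hypothetical traffic shares measured in monitoring mode (percent of all requests).
shares_to_block = {
    "scrapers": 12.0,
    "human-mimicking bots": 4.5,
    "suspicious visitors": 3.0,
    "security events": 1.5,
}

# Step 5: the sum of the blocked categories' shares estimates the traffic drop.
traffic_drop = sum(shares_to_block.values())

# Rule of thumb from the article: double the traffic share for a rough
# (minimum) estimate of the server load reduction, capped at 100%.
load_drop = min(traffic_drop * 2, 100.0)

print(f"Estimated traffic reduction: {traffic_drop:.1f}%")
print(f"Rough minimum load reduction: ~{load_drop:.1f}%")
```

With these example numbers, blocking the four categories would cut incoming traffic by about a fifth and, by the rule of thumb above, server load by roughly twice that.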
Now you should have a clearer picture of what you can gain by blocking the unwanted traffic. And all without having to spend a penny!
If you have any thoughts or desire to discuss this topic, please write to me at danica (at) botguard.ee.