Google’s Martin Splitt shared how SEOs and site owners can fight back against malicious bots and improve site performance.
Google’s Martin Splitt answered a question about malicious bots that impact site performance, offering suggestions every SEO and site owner should know and put into action.
Many SEOs overlook security and bot traffic in their site audits because it isn’t widely understood among digital marketers that security events affect site performance and can explain why a site is inadequately crawled. Improving Core Web Vitals will do nothing when a poor security posture is what’s actually dragging performance down.
Virtually every website is under attack, and excessive crawling can trigger 500 server error response codes, signaling an inability to serve web pages and hindering Google’s ability to crawl them.
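Heavy bot traffic often shows up first in server logs as a spike in 500 responses. A minimal sketch of spotting which IPs are triggering them, assuming the common Apache/Nginx combined log format (the log lines below are hypothetical):

```python
import re
from collections import Counter

# Matches the common Apache/Nginx combined log format, e.g.:
# 203.0.113.5 - - [10/Oct/2024:13:55:36 +0000] "GET /page HTTP/1.1" 500 1234
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) ')

def count_500s_by_ip(log_lines):
    """Count how many 500 responses each client IP triggered."""
    counts = Counter()
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if match and match.group(2) == "500":
            counts[match.group(1)] += 1
    return counts

sample = [
    '203.0.113.5 - - [10/Oct/2024:13:55:36 +0000] "GET /a HTTP/1.1" 500 512',
    '203.0.113.5 - - [10/Oct/2024:13:55:37 +0000] "GET /b HTTP/1.1" 500 512',
    '198.51.100.7 - - [10/Oct/2024:13:55:38 +0000] "GET /c HTTP/1.1" 200 1024',
]
print(count_500s_by_ip(sample))  # Counter({'203.0.113.5': 2})
```

An IP responsible for a disproportionate share of 500s is a candidate for blocking or rate limiting.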
The person asking the question wanted Google’s advice on how to fight back against the waves of scraper bots impacting their server performance.
This is the question asked:
“Our website is experiencing significant disruptions due to targeted scraping by automated software, leading to performance issues, increased server load, and potential data security concerns. Despite IP blocking and other preventive measures, the problem persists. What can we do?”
Google’s Martin Splitt suggested identifying the service that is serving as the source of the attacks and notifying them of an abusive use of their services. He also recommended the firewall capabilities of a CDN (Content Delivery Network).
Martin answered:
“This sounds like somewhat of a distributed denial-of-service issue if the crawling is so aggressive that it causes performance degradation.
You can try identifying the owner of the network where the traffic is coming from, thank “their hoster” and send an abuse notification. You can use WHOIS information for that, usually.
Alternatively, CDNs often have features to detect bot traffic and block it and by definition they take the traffic away from your server and distribute it nicely, so that’s a win. Most CDNs recognize legitimate search engine bots and won’t block them but if that’s a major concern for you, consider asking them before starting to use them.”
Identifying the cloud provider or server data center that’s hosting the malicious bots is good advice. But there are many scenarios where that won’t work.
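When the source network isn’t hidden, WHOIS output usually does include an abuse contact to notify. A rough sketch of extracting one from a raw WHOIS response (the response fragment is hypothetical, and field names vary by registry, so the matching here is deliberately loose):

```python
import re

def extract_abuse_contacts(whois_text):
    """Pull abuse-contact email addresses out of raw WHOIS output.
    Registries use different field names (OrgAbuseEmail, abuse-mailbox, ...),
    so match any line mentioning 'abuse' that contains an email address."""
    contacts = []
    for line in whois_text.splitlines():
        if "abuse" in line.lower():
            contacts.extend(re.findall(r"[\w.+-]+@[\w.-]+\.\w+", line))
    return contacts

# Hypothetical WHOIS response fragment (RIPE-style fields)
sample_whois = """\
inetnum:        203.0.113.0 - 203.0.113.255
netname:        EXAMPLE-NET
abuse-mailbox:  abuse@example-hoster.net
"""
print(extract_abuse_contacts(sample_whois))  # ['abuse@example-hoster.net']
```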
Bots often use VPNs and the open source Tor network to hide their source, defeating any attempt to identify the cloud service or web host providing their infrastructure. Attackers also hide behind networks of compromised home and business computers, called botnets, to launch their attacks, and there is no practical way to identify them.
Some bots respond to IP blocking by instantly switching to a different network and resuming the attack. An attack can originate from a German server and, when blocked, switch to a network provider in Asia.
Contacting network providers about abusive users is futile when the source of the traffic is obfuscated or spread across hundreds of sources. Many site owners and SEOs might be surprised to discover how intense the attacks on their websites are. Even taking action against a small group of offenders is an inefficient use of time, because literally millions of other bots will replace the ones a cloud provider blocks.
And what about botnets made up of thousands of compromised computers around the world? Think you have time to notify all of those ISPs?
Those are three reasons why notifying infrastructure providers is not a viable approach to stopping bots that impact site performance. Realistically, it’s a futile and inefficient use of time.
Using a Web Application Firewall (WAF) is a good idea, and that’s the function Martin Splitt was pointing to when he mentioned using a CDN (content delivery network). A CDN, like Cloudflare, serves browsers and crawlers the requested web page from the server located closest to them, speeding up site performance and reducing the load on the site owner’s server.
A CDN typically also includes a WAF (Web Application Firewall), which automatically blocks malicious bots. Martin’s suggestion to use a CDN is definitely a good option, especially because it has the additional benefit of improving site performance.
An option that Martin didn’t mention is a WordPress WAF plugin like Wordfence. Wordfence’s WAF automatically shuts down bots based on their behavior. For example, if a bot requests an excessive number of pages, it automatically creates a temporary IP block; if the bot rotates to another IP address, Wordfence identifies the same crawling behavior and blocks it again.
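Wordfence’s internals aren’t public, but the general behavior-based approach described above, temporarily blocking any IP that exceeds a request threshold within a short window, can be sketched roughly like this (the class name, thresholds, and logic are illustrative assumptions, not Wordfence’s actual implementation):

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Temporarily block IPs that exceed max_requests within `window` seconds."""
    def __init__(self, max_requests=100, window=60.0, block_time=300.0):
        self.max_requests = max_requests
        self.window = window
        self.block_time = block_time
        self.hits = defaultdict(deque)   # ip -> timestamps of recent requests
        self.blocked_until = {}          # ip -> time when the block expires

    def allow(self, ip, now=None):
        now = time.time() if now is None else now
        # Still inside a temporary block?
        if self.blocked_until.get(ip, 0) > now:
            return False
        # Drop timestamps that fell out of the sliding window.
        recent = self.hits[ip]
        while recent and recent[0] <= now - self.window:
            recent.popleft()
        recent.append(now)
        if len(recent) > self.max_requests:
            self.blocked_until[ip] = now + self.block_time
            return False
        return True

limiter = RateLimiter(max_requests=5, window=10.0, block_time=60.0)
results = [limiter.allow("203.0.113.5", now=float(t)) for t in range(8)]
print(results)  # [True, True, True, True, True, False, False, False]
```

Because the block keys on observed behavior within a window rather than a static blocklist, a bot that rotates to a fresh IP and repeats the same aggressive crawling simply trips the threshold again.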
Another solution to consider is a SaaS platform like Sucuri that offers a WAF and a CDN to speed up performance. Both Wordfence and Sucuri are trustworthy providers of WordPress security and they come with limited but effective free versions.
See also: WordPress Security: 16 Steps to Secure & Protect Your Site
Listen to the question and answer at the 6:36 mark of the Google SEO Office Hours podcast:
Featured Image by Shutterstock/Krakenimages.com
Copyright © 2024 Search Engine Journal. All rights reserved. Published by Alpha Brand Media.