The SEO landscape is changing fast, and many of the tactics we rely on today are becoming obsolete. To keep driving organic traffic and real value, we need to rethink our approach.
New developments – like Google’s AI Overviews, rolled out in May – are reshaping search results, shifting us away from the old “10 blue links” model.
AI is influencing not just platforms and processes but also how users behave and interact with content.
This article covers key insights and practical strategies to help you adapt to these changes and prepare your SEO efforts for 2025 and beyond.
When you search for “what is a zap in Zapier,” you may notice that the featured snippet appears as a citation in the AI Overview.
This is not an isolated case.
Many reports highlight similarities between featured snippets and the answers provided in AI Overviews.
Some experts have even suggested that this redundancy could lead Google to eventually phase out featured snippets altogether.
There’s still value in trying to appear in featured snippets – earning one increases your chances of also being cited in AI Overviews.
Now is the time to get your ducks in a row and build a process for optimizing for featured snippets. That work will remain useful even if Google eventually retires featured snippets.
Dig deeper: AI SEO: How to be visible in Google AI Overviews, chatbots, LLMs
Here’s an example from searching for the same query, “what is a zap in Zapier.” A click on the link icon shows two URLs that were used in creating this part of the answer.
About 21.1% of queries trigger AI Overviews, and each answer contains an average of 8.9 links, per a study by Rich Sanger and Authoritas.
That means roughly 8.9 links inside the AI Overview, on top of the organic results in the SERP.
AI Overviews aren’t here to destroy organic traffic. Instead, they are reshaping the ranking landscape and creating new opportunities in SERPs by providing more visibility.
Now is the time to shift our focus: rather than solely aiming for a spot in the top 10 results, we should also prioritize appearing in AI Overview citations.
The good news is that you don’t need to rank in the top 10 to be cited in these overviews.
There have been various studies showing the differences in the top 10 ranking results and the URLs cited in AI Overviews.
Up to 46.5% of the URLs included in AI Overviews rank outside the top 50 organic results, per an Advanced Web Ranking study.
Focus on creating high-quality content, regardless of whether you’re in the top 10 rankings.
Even if you don’t rank highly, you still have the potential to be cited in AI Overviews or by large language models (LLMs).
Low-quality content is becoming obsolete, and there’s increasing value in producing meaningful content beyond just SERP rankings.
While measuring this value may be difficult at the moment, we will soon have the tools to assess it more effectively.
Dig deeper: The art of AI-enhanced content: 8 ways to keep human creativity front and center
We hear about backlinks losing their value every year. LLMs and AI Overviews are shifting that narrative once again.
The more brand mentions you have (even without a link), the greater the likelihood that your content will appear in answers within AI Overviews and other LLMs.
Backlinks remain important – if anything, their significance is growing.
Getting your brand mentioned is crucial for increasing your chances of appearing in LLMs and AI Overviews.
Dig deeper: How to harness the power of brand mentions across the search universe
Reddit made up about 4% of the training data for GPT-3, yet it carried a disproportionate 22% of the weight in the model’s influence.
Similarly, Wikipedia accounted for around 1% of the training data but carries roughly three times that weight in its impact on LLMs.
Given its current importance, it’s clear that being active on platforms like Reddit is essential for maximizing your Google visibility.
To appear in AI Overviews and other LLMs, it’s crucial not only to be mentioned or cited but also to actively establish your brand on platforms like Reddit.
Instead of simply driving traffic from Reddit to your website, consider posting relevant information directly on the platform.
When done thoughtfully and without spamming subreddits, this approach can resemble a white or gray hat strategy of parasite SEO that adds value to your business.
Additionally, maintaining a presence on Wikipedia is important – just remember to avoid spamming.
As Rand Fishkin put it, “You have to be present with native content in the zero-click platforms because that is the future.”
Dig deeper: How to win with generative engine optimization while keeping SEO top-tier
We can no longer afford to lose clicks because our titles in SERPs are not optimized for CTR.
If AI Overviews are going to absorb some of those clicks, we need to make every impression in the SERPs count – and the way to do that is by optimizing for CTR.
I can’t emphasize this enough: SEO is evolving, and our strategies must adapt accordingly.
CTR tests should be a fundamental aspect of our approach, as even small adjustments in click-through rates can lead to significant results.
These improvements can be observed at various levels, including individual pages, folders and sitewide performance.
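As a starting point, you can pull page-level CTR data from the Search Console Search Analytics API and flag pages that earn plenty of impressions but few clicks. This is a minimal sketch, assuming a verified property and a service-account key file; the property URL, date range and thresholds are placeholders, not recommendations.

```python
# Minimal sketch: flag high-impression, low-CTR pages via the
# Search Console Search Analytics API. The property URL, key file,
# date range and thresholds below are placeholder assumptions.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://www.example.com/"  # hypothetical verified property

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

response = gsc.searchanalytics().query(
    siteUrl=SITE,
    body={
        "startDate": "2024-09-01",
        "endDate": "2024-09-30",
        "dimensions": ["page"],
        "rowLimit": 1000,
    },
).execute()

# Pages with many impressions but a weak CTR are the best candidates
# for title and meta description tests.
for row in response.get("rows", []):
    if row["impressions"] >= 1000 and row["ctr"] < 0.02:
        print(f"{row['keys'][0]}: {row['impressions']} impressions, "
              f"{row['ctr']:.1%} CTR, avg. position {row['position']:.1f}")
```

Even a modest lift matters: moving a page from a 2% to a 3% CTR on 100,000 monthly impressions is roughly 1,000 extra clicks.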
Negative SEO, which once involved building toxic backlinks to competitors’ websites using spammy anchor texts, is set to evolve.
Competitors may now attempt to manipulate AI Overviews and LLMs to present negative information about a brand.
To protect your brand from manipulation, monitor brand mentions and build a strong identity that is not easily undermined.
We need to develop effective ways to measure our brand’s health in SERPs and online.
We already have metrics such as total branded searches, branded clicks, share of voice and interest in the brand (as indicated by Google Trends), but we should also develop additional metrics and methods to assess brand health comprehensively.
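For the Google Trends signal, the unofficial pytrends library offers a quick way to chart relative interest in a brand name over time. A rough sketch, assuming the library is installed and using a placeholder brand term; as an unofficial scraper it can break without notice.

```python
# Rough sketch: relative search interest in a brand name over the past
# 12 months, via the unofficial pytrends library. "example brand" is a
# placeholder - swap in your own brand (and competitors for share of voice).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["example brand"], timeframe="today 12-m")

interest = pytrends.interest_over_time()  # weekly 0-100 index per term
if not interest.empty:
    print(interest["example brand"].tail(12))
```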
Dig deeper: Online reputation management: Top 10 hurdles and how to overcome them
As Michael King puts it, “Search, as we know it, has been irrevocably changed by generative AI.”
Answer engines like Perplexity and features like Google’s AI Overviews use retrieval-augmented generation (RAG), a technique that grounds answers in live, retrieved data to keep them factual and reduce hallucinations.
Just as we once learned how Google operated, we must now understand the workings of RAG technology.
By investing time in grasping how RAG functions, you can better navigate this new reality and adjust your tactics and strategies accordingly.
Many SEOs have yet to change their playbooks simply because they don’t fully understand these changes. Take the time to learn about RAG – it’s crucial for adapting to the future of SEO.
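To make the mechanics concrete, here is a toy retrieval-augmented generation loop: retrieve the passages most similar to the query, then hand only those passages to the language model as grounding context. It is a bare-bones illustration – the bag-of-words "embedding" and call_llm() are stand-ins – not a description of how Google or Perplexity actually implement RAG.

```python
# Toy RAG loop: retrieve the most relevant passages, then ground the
# model's answer in them. The hashing-free "embedding" and call_llm()
# are placeholders, not real production components.
import math
from collections import Counter

DOCS = [
    "A Zap is an automated workflow in Zapier that connects your apps.",
    "Featured snippets are highlighted excerpts shown at the top of Google results.",
    "Crawl budget is the number of URLs a search bot will crawl on each visit.",
]

def embed(text):
    """Crude bag-of-words 'embedding' used only for illustration."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, k=2):
    q = embed(query)
    return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def call_llm(prompt):
    # Placeholder for a real model call (OpenAI, Gemini, a local model, etc.).
    return f"[model answer grounded in]:\n{prompt}"

def answer(query):
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

print(answer("What is a Zap in Zapier?"))
```

The practical takeaway: the model only "sees" the passages that win retrieval, so clear, self-contained, well-structured passages are what earn citations.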
Google Debunks The "Index Bloat" Theory – Search Engine Journal
Google’s John Mueller debunks the “Index Bloat” theory, stating there’s no limit on the number of pages indexed per site.
In a recent episode of the ‘Search Off The Record’ podcast, Google’s Search Relations team addresses questions about webpage indexing.
A key point of discussion was the concept of “index bloat” – a theory that has garnered attention within the SEO community.
Google Search Advocate John Mueller refutes the idea of index bloat, which posits that excessive indexing of unnecessary pages can negatively impact search engine rankings.
This article covers the details of the index bloat theory, Google’s response, and the broader implications for SEO practices.
The term “index bloat” describes a situation where search crawlers index pages that aren’t ideal for search results.
This includes a variety of pages, such as filtered product pages, internal search results, printer-friendly versions of pages, and more.
Proponents of the index bloat theory argue that these pages make it harder for search engines to understand websites, negatively impacting search rankings.
The theory relates to the concept of a crawl budget, the number of URLs a search bot will crawl during each visit.
The theory suggests that index bloat can lead to inefficient use of this crawl budget as search bots spend time and resources collecting unnecessary information.
Mueller debunks the index bloat theory, stating:
“I’m not aware of any concept of index bloat at Google. Our systems don’t artificially limit the number of pages indexed per site. I’d just make sure that the pages which you’re providing for indexing are actually useful pages, but that’s independent of the number of pages your site has.”
This statement challenges the fundamental premise of index bloat.
According to Mueller, Google doesn’t impose an artificial limit on pages indexed per site.
Rather than worrying about omitting pages from Google’s index, Mueller believes your time is better spent publishing helpful content.
Those who support the index bloat theory often cite causes such as accidental page duplication, incorrect robots.txt files, and poorly performing or thin content.
However, Google’s position is that these aren’t causes of a nonexistent “index bloat” problem – they’re general SEO issues that webmasters and SEO professionals should pay attention to anyway.
Proponents of the index bloat theory have suggested using tools like Google Search Console to detect index bloat by comparing the number of indexed pages to what is expected.
Google’s standpoint, however, implies this comparison doesn’t indicate a problem. It’s part of regular website management and monitoring.
Despite the conversations around index bloat, Google’s official stance is clear: the notion is debunked.
Instead, the focus should be on ensuring that the pages you provide for indexing are valuable and relevant.
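Even without an indexing limit to worry about, it is still good hygiene to keep obviously low-value URL patterns – internal search results, filtered listings, print versions – out of the index. Here is a simple sketch that scans a sitemap and flags URLs matching such patterns; the sitemap URL and the patterns are placeholder assumptions for illustration.

```python
# Simple hygiene check: flag sitemap URLs that look like internal search
# results, filtered listings or print versions. The sitemap URL and the
# patterns below are placeholders - adjust them to your own site.
import re
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical
LOW_VALUE_PATTERNS = [
    r"[?&]s=",                          # internal search results
    r"[?&](sort|filter|color|size)=",   # filtered/faceted listings
    r"/print/",                         # printer-friendly versions
]

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", ns):
    url = loc.text.strip()
    if any(re.search(p, url) for p in LOW_VALUE_PATTERNS):
        print("Review (consider noindex or canonical):", url)
```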
Source: Google Search Off The Record
Phish ’n’ Ships: Human Security warns of fake shops exploiting payment platforms and SEO – SiliconANGLE News
A new report out today from cybersecurity company Human Security Inc. is warning of a large-scale phishing scheme, dubbed “Phish ‘n’ Ships,” that leverages fake online shops and search engine manipulation to defraud consumers.
Uncovered by the company’s Satori Threat Intelligence and Research team, the Phish ‘n’ Ships scheme is described as a sophisticated effort to exploit consumers by using fake web shops and compromised search engine ranks. The threat actors behind the scheme infect legitimate websites to create and rank fake product listings for popular items, making them appear in top search results. When unsuspecting consumers click on these links, they are redirected to counterfeit stores controlled by the attackers.
Once on the fake site, consumers go through what appears to be a typical online checkout process. Payment information is collected through one of several targeted payment processors, allowing the attackers to capture funds and sensitive card data. Victims believe they are purchasing real items, but the products never arrive.
The report notes that the operation has affected more than 1,000 websites and created 121 fake online stores, costing victims millions of dollars. By abusing search engine optimization tactics, the attackers drew significant traffic to the counterfeit sites, with the scheme estimated to have hit hundreds of thousands of consumers over the past five years.
While the report stops short of saying the operators are based in mainland China, it notes that the threat actors’ internal tools were written in Simplified Chinese, the script used in mainland China, rather than the Traditional Chinese used in Hong Kong, Taiwan and Macau.
Working with payment platforms, Human Security has disrupted much of the operation: Google has removed many of the fraudulent listings from its search results, and the payment processors involved have suspended the accounts associated with the scheme. Law enforcement agencies and the broader threat intelligence community have also been informed to prevent further losses.
Though the links to the scheme may have mostly been removed and its operations stunted, Phish ‘n’ Ships remains a live threat, with attackers searching for new methods to evade detection. Human Security is warning consumers to remain vigilant when shopping online, especially for deals that seem too good to be true.
Google Warns: URL Parameters Create Crawl Issues – Search Engine Journal
Google’s Gary Illyes warns of URL parameter issues causing crawler inefficiencies, especially for e-commerce sites.
Gary Illyes, Analyst at Google, has highlighted a major issue for crawlers: URL parameters.
During a recent episode of Google’s Search Off The Record podcast, Illyes explained how parameters can create endless URLs for a single page, causing crawl inefficiencies.
Illyes covered the technical aspects, SEO impact, and potential solutions. He also discussed Google’s past approaches and hinted at future fixes.
This info is especially relevant for large or e-commerce sites.
Illyes explained that URL parameters can create what amounts to an infinite number of URLs for a single page.
He said:
“Technically, you can add that in one almost infinite – well, de facto infinite – number of parameters to any URL, and the server will just ignore those that don’t alter the response.”
This creates a problem for search engine crawlers.
While these variations might lead to the same content, crawlers can’t know this without visiting each URL. This can lead to inefficient use of crawl resources and indexing issues.
The problem is prevalent among e-commerce websites, which often use URL parameters to track, filter, and sort products.
For instance, a single product page might have multiple URL variations for different color options, sizes, or referral sources.
Illyes pointed out:
“Because you can just add URL parameters to it… it also means that when you are crawling, and crawling in the proper sense like ‘following links,’ then everything– everything becomes much more complicated.”
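You can see how quickly parameters multiply – and how normalization collapses them – with a few lines of Python. A sketch using only the standard library; the parameter names treated as ignorable are assumptions you would tailor to your own site.

```python
# Sketch: many parameterized variations of one product URL collapse to a
# single canonical form once tracking/sorting parameters are stripped.
# The list of ignorable parameters is an assumption - tune it per site.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORABLE = {"utm_source", "utm_medium", "utm_campaign", "ref", "sort", "sessionid"}

def canonicalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORABLE]
    kept.sort()  # stable ordering so equivalent URLs compare equal
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

variants = [
    "https://shop.example.com/widget?color=red&utm_source=newsletter",
    "https://shop.example.com/widget?utm_campaign=sale&color=red&sort=price",
    "https://shop.example.com/widget?color=red&ref=homepage",
]

print({canonicalize(u) for u in variants})
# {'https://shop.example.com/widget?color=red'} - one page, one canonical URL
```

The same logic is what rel=canonical tags and consistent internal linking communicate to crawlers: many addresses, one page.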
Related: Crawler Traps: Causes, Solutions & Prevention
Google has grappled with this issue for years. In the past, Google offered a URL Parameters tool in Search Console to help webmasters indicate which parameters were important and which could be ignored.
However, this tool was deprecated in 2022, leaving some SEOs concerned about how to manage this issue.
While Illyes didn’t offer a definitive solution, he hinted at potential approaches:
Related: Google Confirms 3 Ways To Make Googlebot Crawl More
This discussion has several implications for SEO:
URL parameter handling remains tricky for search engines.
Google is working on it, but you should still monitor URL structures and use tools to guide crawlers.
The full discussion is available in the Search Off The Record podcast episode.
Google: A ‘Site:’ Search Doesn’t Show All Pages – Search Engine Journal
Google reminds everyone that a site: search is not meant to show all indexed pages on a site.
Google’s John Mueller says the results of a site: query are not a comprehensive collection of all a website’s pages.
This topic is addressed in the latest installment of the Ask Googlebot video series on YouTube.
Specifically, Mueller answers the following question:
“All of my URLs are indexed and when I check in Google one by one, the total number of URLs is 180. But in Google SERP, only 28 URLs show. Why is that?”
This person says all their pages can be found in Google when searching for them individually, but only 28 pages are shown in a site: search.
That may seem unusual, but as we hear from Mueller, it’s perfectly normal.
Here’s his response.
Related: An SEO Guide to Advanced Google Search Operators
A site: query is a command that instructs Google to return results from one specific domain.
For example, if you only wanted to see pages from Search Engine Journal, you would type “site:searchenginejournal.com” into Google.
You can take these searches a step further by adding a keyword in front of the site: command. This will tell Google to return results from one domain that are relevant to the keyword in the query.
If you wanted to see articles from Search Engine Journal about Core Web Vitals, for example, you’d type “core web vitals site:searchenginejournal.com” into Google.
While this can be a useful tool for finding what you’re looking for, it’s not designed for diagnostic purposes.
Mueller explains:
“The short answer is that a site: query is not meant to be complete, nor used for diagnostics purposes.
A site query is a specific kind of search that limits the results to a certain website. It’s basically just the word site, a colon, and then the website’s domain.
This query limits the results to a specific website. It’s not meant to be a comprehensive collection of all the pages from that website.”
If you know you have 100 indexed pages, but a site: search only returns 50, there’s no reason to be concerned.
For an accurate look at how many of your pages Google is able to index, use Search Console.
The Index Coverage report in Search Console will show exactly which pages are indexed, and which pages (if any) have errors that prevent indexing.
Mueller continues:
“If you’re keen on finding out how many pages Google has indexed from your website, then use Search Console instead.
Google Search Console is a free tool you can use to get more information about how Google Search sees your website.
Within Search Console you can see how your pages are shown in search, as well as the number of pages that are currently indexed.
In short, don’t worry about the counts shown in a site: query. Use Search Console instead.”
For complete details on how to use Search Console’s Index Coverage report, see this explainer:
If Google is unable to index some of your pages, this report will tell you why. You can then fix the issue and use the same report to resubmit the URLs for indexing.
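If you prefer to check index status programmatically, the Search Console URL Inspection API exposes much of the same information. A minimal sketch, assuming the API client is authorized for the property; the property and page URL are placeholders.

```python
# Minimal sketch: check a URL's index status via the Search Console
# URL Inspection API. The property, page URL and key file are
# placeholders; the client must be authorized for that property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

result = gsc.urlInspection().index().inspect(
    body={
        "inspectionUrl": "https://www.example.com/some-page/",
        "siteUrl": "https://www.example.com/",
    }
).execute()

status = result["inspectionResult"]["indexStatusResult"]
print(status.get("verdict"), "-", status.get("coverageState"))
```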
See the full Ask Googlebot video for more details.