November 6, 2024
Search engine optimization (SEO) just got more difficult, but where there’s change, there’s opportunity. Comms strategist Aidan Muller talks us through how AI may throw a spanner into the works for brands.
Generative AI is fast being integrated into search (Google, Bing) and, in some cases, will replace search altogether. And since around two-thirds of online experiences start with a search, according to a 2019 report, there will be great rewards for the companies and organizations whose ideas, products and services are featured in those AI results.
Just as we did for search with SEO, we are starting to see a new professional industry emerge for the optimization of assets in order to shape AI results. However, unlike search, generative AI is expected to provide a synthesis and, as such, will be significantly more competitive than search. As the stakes inevitably get higher, I expect this will become a significant battleground for brands.
Some have called it AI Optimization (AIO), others have called it Generative Engine Optimization (GEO) – but I tend to refer to it as AI Results Optimization (AIRO) to distinguish it from optimization of the AI models themselves. Time will tell which acronym sticks!
Influencing AI results requires understanding how generative engines work. While we don’t all need to become AI engineers, it’s worth understanding the mechanics so we’re talking the same language.
Basically, there are three levels at which influence can be exerted:
The training data
The algorithm
Reinforcement learning from human feedback (RLHF)
Influencing the AI’s training data
Large language models (LLMs) are generally trained on large collections of text, called corpora (e.g. Common Crawl, C4, BooksCorpus). Since these are aggregates, they are quite hard to influence directly. Marketers are better off thinking about the largest single data sources: Wikipedia, for example, is one of the largest, as are GitHub, ArXiv, Quora and Reddit.
The good news is that the recent(ish) implementation of retrieval-augmented generation (RAG) – which allows AIs to fetch up-to-date information from a search engine or another data source – has made it easier and quicker to influence AI results. The bad news is that I expect AI models will become more discerning in time.
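To make the mechanics concrete, here is a minimal sketch of the RAG pattern: retrieve relevant documents first, then hand them to the model as context. The corpus, scoring function and URLs below are invented for illustration; real systems use a search index or embeddings rather than keyword overlap.

```python
# Minimal illustration of the retrieval-augmented generation (RAG) pattern:
# fetch relevant documents first, then assemble them into the prompt sent to
# the model. The corpus, scoring and URLs below are illustrative stand-ins,
# not any specific vendor's API.

CORPUS = [
    {"url": "https://example.com/faq", "text": "Acme's reusable bottle keeps drinks cold for 24 hours."},
    {"url": "https://example.com/blog", "text": "How Acme sources recycled steel for its bottles."},
]

def retrieve(query: str, k: int = 2) -> list:
    """Rank documents by naive keyword overlap (a real system would use a search index or embeddings)."""
    terms = set(query.lower().split())
    scored = sorted(CORPUS, key=lambda d: -len(terms & set(d["text"].lower().split())))
    return scored[:k]

def build_prompt(query: str) -> str:
    """Assemble retrieved snippets plus the question into a single prompt."""
    context = "\n".join(f"- {d['text']} (source: {d['url']})" for d in retrieve(query))
    return f"Answer using only the sources below.\n{context}\n\nQuestion: {query}"

print(build_prompt("Which bottle keeps drinks cold?"))
# The assembled prompt is what actually reaches the LLM - which is why content
# that is crawlable, clearly written and well-sourced is more likely to surface
# in AI answers.
```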
There have been interesting experiments, for example, to try and game AI results. Kevin Roose mentioned two of them in a recent New York Times piece: furnishing the data source with “strategic text sequences” or invisible keywords in white text (previously known as ‘keyword stuffing’).
We will hear of many more hacks of this type in the coming year. But much like the black-hat techniques that tried to game search algorithms in the early years of SEO, these will eventually be phased out after a few algorithm updates. And if the Google experience is anything to go by, we may even see the deployers of black-hat techniques penalized.
For the time being, the most responsible and coherent avenues to shape training data are drawn from best practice in content development, SEO, web development and traditional PR, with the addition of a new focus on large data-rich platforms.
Comms directors should be prioritizing:
Creating good-quality content on owned properties (websites, microsites and, to a lesser extent, social media)
Answering concrete questions in a helpful, well-sourced way
Making sure your online properties are crawlable and your data is structured (a brief structured-data sketch follows below)
Cultivating your domain’s authority to make sure your content gets found
Getting credible, authoritative news outlets and publishers to say nice things about your brand. (Although many of these platforms have blocked AI access, I expect they will eventually reach commercial agreements with the key models.)
Working with a specialist to optimize relevant content or conversations on specialized, data-rich platforms (e.g. Wikipedia, GitHub, ArXiv, Reddit).
These may take longer and be harder than black-hat techniques, but they are the most ethical and least damaging to your reputation.
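On the structured-data point in the list above, here is a minimal sketch of what "structured" can mean in practice: a schema.org JSON-LD block that crawlers (and the search indexes many AI systems retrieve from) can parse as data rather than prose. The organization name, question and answer text are placeholders.

```python
import json

# Minimal sketch: build a schema.org FAQPage JSON-LD block and wrap it in the
# <script> tag you would place in the page's HTML. All names and text below
# are placeholders.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What does Example Co. make?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Example Co. makes reusable bottles from recycled steel.",
            },
        }
    ],
}

print('<script type="application/ld+json">')
print(json.dumps(faq_markup, indent=2))
print("</script>")
```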
As we saw earlier, we can also look to influence the algorithm or the reinforcement learning from human feedback (RLHF) process. These levers, however, operate at the company level, and AI companies will probably not take kindly to external forces looking to shape their product.
This option is an even heavier lift than shaping the training data, but it will be particularly attractive where the stakes are high, for organizations who don’t want to leave anything to chance. The variety of competing interests means this is likely to become a fierce battleground for brands, products and ideas.
I foresee that these organizations will do this in a few different ways.
Commercial arrangements with the AI company/ies will undoubtedly play a big part. There will almost certainly be a space for a new advertising model, and there may be more specific sponsorship agreements (to feature one product or message rather than another).
But there will also be room to influence the rules of the game. Larger brands may want to shape the policy or regulatory framework, in a way that favors their products, services or ideas. In many ways, this battle is already underway with the debate around safety.
These activities are more likely to influence generative AI results in the negative than in the positive. In other words, they might not lead to the AI promoting your particular healthy snack brand, but they may eventually downgrade high-sugar alternatives as a matter of policy. They may not feature your specific hybrid car brand, but they might promote hybrid cars over inefficient petrol-powered cars.
Intervention in this area would be no different from the bans on advertising tobacco or alcohol in some countries or the watershed on TV content and advertising.
While AI will not completely replace search – some users will still want to see the source material – there is no doubt that AI results will replace a significant share of searches, especially where the output is synthesized information.
The urge to influence is hard-wired into our DNA. The shift from a handful of above-the-fold search results to a single AI result will make for a more competitive – and possibly more adversarial – environment.
In the short term, there will be a significant competitive advantage for the organizations at the top of AI results and for the professionals who master AIRO. As an industry around influencing AI results develops and professionalizes – and standards get defined – the stakes are likely to be raised, and organizations will go to ever greater lengths to influence them.
For citizens and consumers, the breadth of results may – at least initially – narrow, though this might favor newcomers. AI-generated results will increasingly be weighted towards the product, brand or idea with the highest bid. In some instances, results will be manipulated by black-hat marketing professionals, unscrupulous political campaigners and ill-intentioned international actors.
We will need mechanisms to ensure trust and transparency (in the same way that search and social media have to signpost ad content). And we will need to become more discerning and even more distrustful of online information (with unknown societal consequences in the long run).
This is a call for professionals to understand the mechanics of AIRO and become better equipped. It’s inevitable that this will become an arms race. The onus is on us to do this ethically.
Aidan Muller is the director of Daimon Communications, and co-founder of the Appraise Network. Read more from The Drum opinion here.
Google's John Mueller on Structured Data, Speed, Disavows, Legacy Penalties & Much More [PODCAST] – Search Engine Journal
Google’s John Mueller also talks about technical SEO, content, communicating with SEOs, and why people should feel free to ask “dumb” questions.
“Not that there are no myths out there, but I think a lot of these things, they stick around for a reason and it’s important that people feel free to ask something that might be a dumb question. And if you never ask a dumb question, then you never learn what the actual answer is, so it’s not that I want people to stop asking this kind of question, it’s important that they ask things that are confusing to them.
And even if that’s something that we hear repeatedly that is a myth or that’s based on an assumption that’s just not true, I think it’s important that people feel the freedom to be able to ask all of these things.”
Got a question about BERT, E-A-T, schema, or anything Google search-related?
John Mueller might’ve already answered it for you on either Twitter, Reddit, a Google Webmaster Hangout or his #AskGoogleWebmasters video series.
John has been doing a great job of connecting webmasters and SEO professionals to the engineers within Google, always trying to improve things for search.
While he’s probably been asked more SEO questions than anyone else, John doesn’t seem to get tired of answering them. Instead, he encourages people to keep asking questions.
He’s incredibly giving of his time and knowledge to help people solve their website and SEO issues – and he deserves to be known more for his significant contributions to the search industry.
For today’s edition of The Search Engine Journal Show, I interviewed John Mueller about his career, structured data, the importance of speed, disavowing links, dealing with legacy penalties, and so much more.
John Mueller is the Senior Webmaster Trends Analyst at Google. He’s been working at Google since September of 2007.
Before joining Google, he owned a software company in Switzerland for more than 12 years.
At the time, he created a sitemaps generator (shortly after sitemaps were introduced). He then became active in Google’s original help forums, trying to figure out how search engines work.
And although it was hard for him to let go of the company he built, he decided to take a risk and seize the opportunity to work at Google.
Listen to this episode and learn more about Mueller’s thoughts on communicating with SEOs, how machine learning will impact search in the coming years, a different approach in solving SEO problems, and more.
How to connect with John Mueller:
Twitter | Reddit
Visit our podcast archive to listen to other Search Engine Journal Show podcasts!
Image Credits
Featured Image: Paulo Bobita
Danny Goodwin is the former Executive Editor of Search Engine Journal. He formerly was managing editor of Momentology and editor …
Want to improve rankings and traffic? Stop blindly following SEO tool recommendations – Search Engine Land
SEO tools can be invaluable for optimizing your site – but if you blindly follow every recommendation they spit out, you may be doing more harm than good.
Let’s explore the biggest pitfalls of SEO tools and how to use them to genuinely benefit your site.
SEO tools are a double-edged sword for anyone involved in content creation or digital marketing.
On the one hand, they offer valuable insights that can guide your strategy, from keyword opportunities to technical optimizations. On the other hand, blindly following their recommendations can lead to serious problems.
Overoptimized content, cosmetic reporting metrics and incorrect technical advice are just some pitfalls of overreliance on SEO tools.
Worse yet, site owners sometimes mistakenly try to optimize for these tool-specific metrics directly. This is something Google’s John Mueller specifically commented on recently when urging bloggers not to take shortcuts with their SEO.
I’ve worked with thousands of sites and have seen firsthand the damage that can be done when SEO tools are misused. My goal is to prevent that same damage from befalling you!
This article details some of the worst recommendations from these tools based on my own experience – recommendations that not only contradict SEO best practices but can also harm your site’s performance.
The discussion will cover more than just popular tool deficiencies. We’ll also explore how to use these tools correctly, making them a complement to your overall strategy rather than a crutch.
Finally, I’ll break down the common traps to avoid – like over-relying on automated suggestions or using data without proper context – so you can stay clear of the issues that often derail SEO efforts.
By the end, you’ll have a clear understanding of how to get the most out of your SEO tools without falling victim to their limitations.
Without fail, I receive at least one panicked email a week from a blogger reporting a traffic drop. The conversation usually follows the same pattern, and I’ve gotten it from both novice and experienced bloggers.
The issue is one of education. Visibility tools, in general, are horribly unreliable.
These tools track a subset of keyword rankings as an aggregate, using best-guess traffic volume numbers, third-party clickstream data and their own proprietary algorithms.
The result: these tools tend to conflate all keyword rankings into one visibility number!
That’s a problem if you suddenly lose a ton of keywords in, for example, positions 50-100, which lowers the overall visibility number for the entire domain.
It’s likely those 50-100+ position keywords were not sending quality traffic in the first place. But because the blogger lost them, the visibility index has decreased, and boom, it looks like they suffered a noticeable traffic drop!
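To make the arithmetic concrete, here is a toy illustration; the click-through rates, keyword counts and the naive count-based "visibility" score are all invented for the example, not how any particular tool actually computes its index.

```python
# Toy illustration: a naive "visibility" score vs. estimated clicks.
# CTR curve, keyword counts and search volumes are invented for the example.

def est_ctr(position: int) -> float:
    """Rough, made-up click-through rate by ranking position."""
    if position <= 3:
        return 0.20
    if position <= 10:
        return 0.05
    return 0.0  # positions beyond page one send essentially no clicks here

# Each tuple: (avg position, number of keywords, avg monthly searches per keyword)
before = [(2, 20, 1000), (8, 50, 500), (60, 900, 200)]
after = [(2, 20, 1000), (8, 50, 500)]  # the position-60 keywords drop out entirely

def visibility(rows):
    """Naive stand-in for a visibility index: sheer count of ranking keywords."""
    return sum(count for _, count, _ in rows)

def est_clicks(rows):
    return sum(count * volume * est_ctr(pos) for pos, count, volume in rows)

print(visibility(before), visibility(after))  # 970 -> 70: a ~93% "visibility" crash
print(est_clicks(before), est_clicks(after))  # 5250.0 -> 5250.0: estimated clicks unchanged
```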
Plenty of visibility tools and metrics exist in the SEO space, and many have value. They can and should be deployed quickly to pinpoint where actual SEO research should come into play when diagnosing problems.
But as SEOs, we educate clients that these same tools should never be the final authority on matters as important as traffic drops or troubleshooting possible SEO issues.
When forming solid hypotheses and recommended action items, always prioritize first-party data in Google Analytics, Google Search Console, etc.
It’s not just these “visibility metrics” that give tools a bad name.
Many of the most popular tools available in the niche provide outdated metrics that have been debunked as a waste of time for SEO priority purposes.
One of those metrics is the popular text-to-HTML ratio metric.
Briefly defined, the metric compares the amount of text on the page to the HTML code required to display it.
This is usually expressed as a percentage, with a “higher” percentage being preferred, as that signifies more text in relation to the code.
Even though this has been repeatedly denied as a ranking factor, it is still a reported audit finding in most crawling programs and popular SEO tool suites.
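For reference, the calculation behind the metric is trivial, which is part of why it tells you so little about ranking; a rough version (with a placeholder HTML snippet) looks like this:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, ignoring script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts, self._skip = [], 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def text_to_html_ratio(html: str) -> float:
    """Visible text length divided by total HTML length."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / len(html) if html else 0.0

sample = "<html><head><style>body{color:#000}</style></head><body><p>Hello world</p></body></html>"
print(f"{text_to_html_ratio(sample):.1%}")  # a "low" ratio - and still not a ranking factor
```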
The same can also be said when discussing the topic of toxic links and disavow files.
Yet, Google has publicly communicated multiple times that toxic links are great for selling tools and that you would be wise to ignore such reports as they do nothing for you.
I can only speak to my experience, but I’ve only ever improved sites by removing disavow files.
Unless you actually have a links-based manual penalty that requires you to disavow links (which you shouldn’t have acquired in the first place), you should stay away from these files as well.
Finally, another great “tool recommendation” to ignore is the purposeful non-pagination of comments.
One of the simplest ways to increase page speed, reduce DOM nodes and improve a page’s bottom-line UX is to paginate comments.
For years, the most popular SEO plugin on the planet, Yoast, provided a Site Health Warning that discouraged users from paginating comments.
Fortunately, after much back-and-forth on GitHub, this was resolved. You’ll still find this recommendation in many auditing tools and SEO plugins, even though it runs counter to Google’s own pagination best practices.
It’s important to understand that the best tools have moved beyond antiquated lexical models like keyword density, word count, TF-IDF and basically “words” in general.
Semantic search has been the order of the day for years, and you should invest in tools that offer actionable insights through information retrieval science and natural language processing.
Think entities, tokens and vectors over keywords and strings. That’s the recipe for tool success.
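As a simplified illustration of the difference, here is a toy comparison of string matching versus vector similarity; the tiny hand-made "embeddings" below stand in for what a real language model would produce.

```python
import math

# Tiny hand-made vectors standing in for real model embeddings - purely illustrative.
EMBEDDINGS = {
    "car":        [0.90, 0.10, 0.00],
    "automobile": [0.88, 0.12, 0.02],
    "banana":     [0.05, 0.10, 0.95],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

query, candidates = "car", ["automobile", "banana"]

# Lexical view: neither candidate contains the string "car", so both "miss".
print([query in c for c in candidates])  # [False, False]

# Semantic view: "automobile" sits close to "car" in vector space, "banana" does not.
print([round(cosine(EMBEDDINGS[query], EMBEDDINGS[c]), 2) for c in candidates])  # [1.0, 0.06]
```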
Using SEO tools can be a powerful part of your strategy, but it’s essential to use them wisely.
While they provide an incredible range of data, a tool’s recommendations aren’t always tailored to your specific goals, audience or site context.
Let’s look at some best practices for using SEO tools effectively, ensuring they serve your strategy rather than controlling it.
SEO tools work off their own metrics and internal algorithms, providing data points that can help guide strategy.
However, they lack the human understanding of what makes content genuinely valuable to readers.
When a tool suggests adding more keywords, for instance, think twice before keyword-stuffing – it may boost certain metrics, but it often sacrifices user experience.
Every piece of tool data should be taken as a starting point, not a final directive.
Relying on just one SEO tool can lead to a narrow or skewed view of your site’s performance.
Each tool has unique metrics and algorithms that emphasize different aspects of SEO, so combining insights from platforms like Google Search Console, Semrush and Ahrefs gives you a broader understanding.
Cross-referencing can provide a more balanced perspective, helping you make better-informed decisions.
Many SEO tools focus heavily on technical metrics – heading structure, backlinks or schema deployment, for example.
While important, these shouldn’t overshadow your focus on quality content. A content-first approach remains at the heart of effective SEO.
Tools can help refine and enhance, but content that’s useful and engaging for your audience is what ultimately drives long-term success.
SEO is constantly changing, with Google’s algorithm updates reshaping best practices regularly.
Revisit and adjust your strategies to keep them aligned with the latest insights.
Tools also frequently update their metrics and algorithms, so it’s wise to monitor new features or recommendations that may add fresh value to your approach.
SEO tools sometimes emphasize optimizations that work well for search engines but less well for real users.
For example, a tool may recommend pop-ups to capture leads, but if they interfere with usability, they can lead to high bounce rates and lower overall revenue.
Always put user experience at the forefront, focusing on aspects like site speed, mobile responsiveness and accessibility.
As I state regularly in audits, SEO is all about the little things.
For most sites, it’s never one issue identified by a tool that will control your future fortunes. It’s more of a death-by-a-thousand-cuts situation, causing sites to underperform.
Tools can provide insights, allowing you to best triage your site in these situations. But they should never be followed blindly. Unfortunately, many users (and SEOs) do just that!
In the end, SEO tools are best used when the user approaches them as “aids” rather than “solutions.”
Focus on weighing every tool recommendation against what genuinely benefits your site’s audience, and the end result will be a solid foundation for long-term growth.
Contributing authors are invited to create content for Search Engine Land and are chosen for their expertise and contribution to the search community. Our contributors work under the oversight of the editorial staff and contributions are checked for quality and relevance to our readers. The opinions they express are their own.
Authority Solutions® Co-Owner Mitchell From Shares Insights on Brand Protection at World of Search 2024 – GlobeNewswire
Source: Authority Solutions®
Manila, Philippines, Nov. 06, 2024 (GLOBE NEWSWIRE) —
Authority Solutions®, a leader in digital marketing and SEO services, proudly participated in the World of Search Conference 2024, held at the SMX Convention Center in Manila from September 26 to 29, 2024. The event attracted over 500 attendees, including top SEO professionals, business leaders, and entrepreneurs. Mitchell From, co-owner of Authority Solutions®, took the stage as a speaker to discuss the timely topic of Protecting Your Agency from Brand Infringement.
Mitchell From’s presentation focused on the growing threat of brand infringement in the digital space and shared practical strategies agencies can implement to protect their brand identity and intellectual property. His talk offered business owners valuable insights on securing their online presence in an increasingly competitive market.
“We are honored to be part of this prestigious conference,” said Mitchell From. “We wanted to help agencies understand brand infringement risks and provide them with actionable steps to protect their reputation. Although we focus on SEO and digital growth for our clients, protecting the integrity of a brand’s digital presence is key. We wanted to offer actionable insights that agencies can implement to ensure they thrive in this space.”
The World of Search Conference 2024 offered a platform for industry leaders to share the latest advancements in SEO, digital marketing tactics, and innovative tools. Attendees had the opportunity to network, exchange knowledge, and explore new strategies to stay competitive in the search marketing world.
About Authority Solutions®
Authority Solutions® is a leader in SEO based in Houston, dedicated to helping businesses enhance their online presence and achieve higher search rankings. With a strong focus on delivering measurable results, Authority Solutions® tailors strategies to meet the unique needs of its clients, ensuring sustainable growth in a highly competitive online marketplace.
Source: https://thenewsfront.com/authority-solutions-co-owner-mitchell-from-shares-insights-on-brand-protection-at-world-of-search-2024/
Legal SEO & Content Tips, Unlearning Bad Habits & More with Alex Valencia [PODCAST] – Search Engine Journal
Alex Valencia talks about the art of doing webinars, building relationships, why just writing blog posts isn’t always a smart idea for law firms, and more.
“Sometimes we’re put in bad positions to help us grow. The metaphor in jiu-jitsu was when I was training a lot more, you’re always put in a position that if you’re not trained, it’s hard for you to get out of it, but if you kind of sit back, breathe, you’re going to find a way out.”
As early as the 2000s, Alex Valencia was already doing sales and marketing in the banking industry.
But in 2008, the market crashed and he went from being an executive making all this money to not making anything.
“It was like a punch to the belly,” as Alex described what he went through at the time.
Yet, this didn’t stop him from persevering. He decided to start a business with his wife – a content firm for law firms – hoping to get back to where he was.
Now, Valencia is a leading figure in the legal marketing space, helping law firms craft an effective digital marketing strategy.
In this episode of The Search Engine Journal Show, let’s get to know more about Alex’s inspiring life and career story, his insights on legal SEO and content marketing, and so much more.
Alex Valencia is the President at We Do Web Content, the content-focused digital marketing agency he co-founded with his wife in 2009.
Currently, he’s also a contributor to Search Engine Journal.
He’s also contributed to legal publications, including PILMMA’s Insiders’ Journal, and spoken at conferences, like the PILMMA Super Summit and the Trial Lawyers Summit, among others.
He also hosts the SEO Happy Hour podcast.
Listen to this episode and learn why simply writing blog posts isn’t always a smart idea for law firms, the art of doing a webinar, and building relationships, among others.
How to connect with Alex Valencia:
Twitter | LinkedIn | Instagram
This podcast is brought to you by Ahrefs and Opteo.
Visit our podcast archive to listen to other Search Engine Journal Show podcasts!
Image Credits
Featured Image: Paulo Bobita
Danny Goodwin is the former Executive Editor of Search Engine Journal. He formerly was managing editor of Momentology and editor …
Recovering from a Google Core Algorithm Update with Lily Ray [PODCAST] – Search Engine Journal
Lily Ray of Path Interactive talks about what to do when you’re looking to recover from a Google core update or your SEO performance is declining.
For episode 176 of The Search Engine Journal Show, I had the opportunity to interview Lily Ray, the SEO Director of Path Interactive.
Ray, a sought-after conference speaker, talks about what to do when you’re looking to recover from a core update or a declining SEO performance.
Lily Ray (LR): Yeah, I don’t buy into that because my team and I here we help clients recover. But that being said, it’s very, very difficult so I understand why Google says that.
And another thing they always say is like, “We tell people that there’s nothing you can do because we don’t want webmasters to go out there and just frantically change a bunch of things that maybe weren’t actually problems, think more long term than that.”
So I do think recovery is possible, but I think it requires a really, really heavy investment in resources and in time and a lot of patience as well.
One thing that we see a lot of is Google rolls out these core updates several times per year, but maybe for two or three core updates after you’ve been addressing some of the problems with your website, you might not see any immediate impacts in performance or positive impacts in performance.
You might even see some negative ones, which can be really disheartening. But, over time, if you invest the right time and energy and focus on the right things, you will ultimately see a recovery.
It might not be a full recovery, it might just be partial. But we’ve seen time and time again that it’s possible to recover.
Brent Csutoras (BC): So obviously seeing a decline is concerning, for company owners and businesses themselves but also for the people who are managing those offices or those initiatives… So how do you approach [these changes]? What would you say are the beginning steps to assess why you’re seeing a change?
LR: I think the first thing we like to look at is which algorithm update affected them.
Maybe they’ve seen improvements over the last couple of updates from 2018, but then in 2019 they started to see some big negative declines. We like to assess what happened on those dates.
So we start with that and then we dig into the data and obviously look at what’s really happening with the sites.
Using Google Search Console, for example, you can get a really good glimpse of which particular pages were affected, which keywords were affected.
Google talks a lot about the fact that it really has to do with relevancy.
So it might be that your website’s perfectly fine, but they’ve kind of recalibrated something in terms of what’s relevant for that query and your website or your webpage might not be the relevant thing anymore.
So we gather data about:
So it’s very case by case, but you start with a high-level theory and then you really have to dig into the data to see what’s really happening.
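For readers who want to reproduce this kind of diagnosis programmatically, page- and query-level data can be pulled from the Search Console Search Analytics API. The sketch below assumes the google-api-python-client and google-auth libraries, an OAuth credentials file, and a placeholder property URL and date range; adapt all of these to your own setup.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Placeholder credentials file authorized for the Search Console read-only scope.
creds = Credentials.from_authorized_user_file(
    "authorized_user.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Compare windows on either side of a suspected core-update date by re-running
# this query with different date ranges.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2024-08-01",
        "endDate": "2024-08-14",
        "dimensions": ["page", "query"],
        "rowLimit": 500,
    },
).execute()

for row in response.get("rows", []):
    page, query = row["keys"]
    print(page, query, row["clicks"], row["impressions"], round(row["position"], 1))
```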
LR: The thing about recovering from core updates is that it’s very multifaceted. It requires looking at a lot of different components of what’s affecting your website simultaneously.
So there’s no silver bullet, there’s no singular thing that you can do to recover, unfortunately, which makes it hard for somebody who doesn’t have a depth of experience in recovery to address some of the problems that might be affecting the website.
But what I like to tell people and what we kind of do here at my agency is we start by doing like a gut check and really asking ourselves like, “Is this truly a high quality website? Is this content truly helpful?”
Because what can happen is you get caught up in thinking your own content is great or thinking that your own SEO strategies are great – because maybe they worked for five or 10 years – and this is actually becoming more and more true in the past couple of years with the algorithm updates.
Some of these sites have been enormous in terms of their market share and how successful they’ve been with SEO. They have a whole team of writers that are writing in the style that they’ve learned works for them from an SEO standpoint.
And suddenly those strategies stop working, which is kind of terrifying. So what we do a lot of is like the clients will come to us and say, “We’ve done everything. We’ve had a great SEO strategy and a great SEO team and we’ve been doing this and it’s been working for us for years.”
And then we kind of start to get under the hood and we say, “Actually like this doesn’t work anymore or this never should have worked in the first place or your content is maybe not as good as you think it is.”
So it’s a lot of tough conversation.
LR: In my experience what happens is when you start to really dig into what’s happening with the website, maybe the technical performance, some of the history and the other strategies that they’ve used throughout the years, particularly as it relates to links.
You’ll start to uncover like, “OK, there’s a bigger problem here than maybe we realized.”
Like we’re working with a site right now where we keep having to ask them questions about their backlink profile because, from their perspective, “No, no, everything we’ve done was legitimate. We worked with the best legitimate companies to pay for links.”
Which is like the key term, right? And we’re like, “You know what? Like I think maybe Google’s getting smarter about that over time. So maybe that worked for you a couple of years ago, but it’s not going to work anymore.”
And their perspective is that everything they’re doing is great because it used to be great.
So it’s tricky, but I think it requires knowing the direction that Google’s algorithm is going in. And I think that’s made a lot of big changes and big leaps in the last couple of years.
We know from the Search Quality Raters Guidelines where they’re trying to go with the algorithm and a lot of the times the things that are in those guidelines don’t jive with the strategies that companies have been using for the last 10 or 15 years.
LR: It’s a great question, I don’t think there’s a specific number and it really depends on your industry.
I’ve seen some sites that they don’t publish very often and they just have a small handful of really meaningful and helpful evergreen pieces of content and that’s all they need. And maybe they build one new one a month or one or two new ones a month or something along those lines.
I think this notion that a lot of companies have, which is we need a new blog constantly. We need one a week or we have this kind of editorial calendar that gets you into a situation where you have a lot of content that probably isn’t performing very well and you might not be auditing that all the time.
And that’s one of the things we look at when we’re recovering from core updates is like do you have 10,000 articles that are not really doing anything?
It might even be bringing down the overall quality of your site. So I think it depends on your industry and the demand for content, but more content is not always better.
Visit our podcast archive to listen to other Search Engine Journal Show podcasts!
Image Credits
Featured Image: Paulo Bobita
Managing Partner / Owner at Search Engine Journal with over 18 years experience in Digital Marketing, specializing in Reddit, Search …