Your Guide To Google’s Exact Match Domain Algorithm Update
The EMD, or Exact Match Domain update, was a 2012 Google algorithm update that targeted exactly what it is named: exact match domain names.
The intent behind this update was not to target exact match domain names exclusively, but to target sites with the following combination of spammy tactics: exact match domains that were also poor quality sites with thin content.
There were really no other nicknames for this update. It went by the EMD update, or Exact Match Domain update.
The major weakness that these websites had was the fact that SEOs would buy domains with exact match keyword phrases, build a site out, but have extremely thin content with little to no value on them. It was extremely easy to do. So easy, in fact, that it was almost like taking candy from a baby in terms of easy SEO wins.
Before the update happened, Google’s Matt Cutts warned the industry that this was going to be a focus of a future update in a webmaster video published on March 7, 2011.
Dr. Pete Meyers at Moz published an article just prior to the release of the EMD update, asking whether EMDs were in decline. His accompanying data showed a sharp decrease in the influence that exact match domains had on actual rankings.
Cutts announced the EMD update on Twitter on September 28, 2012.

Industry Reactions & Discussions
Here are some industry reactions (for more, check out this thread on WebmasterWorld).
It makes me wonder what they mean by “Low Quality”.
The second is not a quality site, but not terrible. It is a two-word keyword phrase .org without dashes. I could see it not being on page one based on its content, but there is no reason for it to be outside of the top 100. It is not that bad.

The EMD Update Was to be Run Periodically
Danny Sullivan at SearchEngineLand wrote that Google confirmed that the EMD algorithm is going to be run periodically, so that those that have been hit stay filtered or have a chance to escape the filter. This is also in place so that Google can catch what they may have missed during the last update.

Spammy Sites & Thin Content Were Big Hits
The goal of this update was to target spammy sites and sites with significant amounts of thin content providing little value beyond exact match words in the domain. Sites with stronger brand recognition and high quality content were less likely to be hit.

What If You Were Hit By the EMD Update?
If you were hit by the update, and you wanted to recover, the general consensus of the proper plan of attack was very similar to Panda:
You would want to get rid of and/or improve the poor quality content. Please note: Google has officially recommended that improving poor quality content is usually the better solution, as opposed to removing low quality pages entirely.
A link profile audit would be beneficial to identify spammy inbound links that had low trust signals, and engage in a link remediation/removal campaign to remove them.
Then, revise your routine so that you add new, custom, high-quality content to your site every day.
And finally: engage in an SEO link acquisition campaign to increase your website’s trust and authority. After the removal of the bad links, continued acquisition would be necessary.

Was There a Pattern to Sites Hit By This Update?
Dr. Pete at Moz released a detailed study on this update. Based on his measurements of who got hit, approximately 41 EMDs fell out of the top 10, with a net change of about 36 domains (after 5 new EMDs entered the top 10). One example site that fell out of the top 10 had an exact match domain. While it wasn’t exactly a spammy site, it was a fairly decent site whose only crime was choosing a keyword rich domain over a branded domain.
The majority of sites in the mix did seem to have more signals that communicated lower quality – things like low-authority, spammy link acquisition, aggressive keyword use, etc. – and they appeared to be ranking just because of the fact that they had an EMD.
There was no clear pattern discernible in Dr. Pete’s data, and thus most SEOs were forced to assume that Google was weighing multiple factors it had not identified publicly.

EMD Update Case Studies
In August 2023, Mark Preston published a case study that tracked the effectiveness of exact match domains and local SEO. The study was inconclusive, but results seemed to point towards at least some effectiveness of exact match domains remaining today.
This case study in late 2014 examined the effectiveness of exact match domains.
Result: It is the quality that matters. If you use an exact match domain, make sure to build out the site with quality content and quality link acquisition. Focus on the major sustainable quality factors that will help you build a successful site, and you should not incur a penalty or other negative action as a result of your efforts. These factors are: quality, uniqueness, authority, relevance, and trust.

Updates From Google Since 2012
Do Exact Match Domains Help or Hurt Google Rankings? John Mueller Weighs In.

Lingering Myths & Misconceptions About the EMD Update
Despite Google’s public stance about the EMD update closing the exact match domain name optimization loophole, some misconceptions remain that buying exact match domains actually works.
Well, it does, to a point.
When purchasing an exact match domain, most webmasters today have good intentions. They have higher quality standards when it comes to building websites. They’ve learned from their mistakes, and sites are rarely built just for the sake of having the EMD anymore.
This author believes that a true gray hat SEO going after exact match domains with gusto is extremely rare nowadays. The filter has been in place, (almost) everyone learned from their mistakes, and most (if not all) webmasters want to stay on Google’s good side. There is less of a misconception nowadays than there used to be.
The biggest misconception back then was that Google was going after all exact match domains. Google had in fact stated it was going after low-quality exact match domains, so not all exact match domains were affected negatively by this update. It is important to draw the distinction here, because this misconception still lingers.

How Did the EMD Update Change SEO?
Even back in 2012, using exact match domains was never a valid technique per Google’s webmaster guidelines. It was considered gray hat, because you were technically manipulating the search results even though it was never clearly black hat.
Because of the prevalence of the technique at that time, Google felt it was necessary to close this loophole.
There were many issues that Panda attempted to solve when it came to sites with low quality content. But when it came to exact match domains, Panda was not doing a great job.
The arrival of the EMD update meant that SEOs could not just purchase a site with exact-match domains, build out some low quality content, and let it sit there and call it a day.
This update meant that SEO was about to get more complex, not quite as easy, and everyone knew that easy ranking wins for clients were over. Now, it would be necessary to plan out everything: on-page SEO, content marketing, link acquisition, everything. It would not be as simple as building out a site instantly and seeing almost immediate rankings benefits.
Resources & Further Reading on Google’s EMD Update
Exact match hasn’t meant ‘exact’ for quite some time but last week Google announced that they are further expanding the broadening of exact match keywords.
And as you know, I’m all about automation.
The analysis I’ll describe is trivial to do through a robust PPC management tool. Read on for a free Google Ads script that you can use to do the analysis quickly in one account or hundreds.

A Positive Impact (On Average)
While Google provides the usual reassurance that the typical account will see benefits from this change, we all know that no account is average.
So we need to make sure that the impact we’ll see for each of the unique accounts we manage will be a positive one.

What Is Changing About Exact Match?
In 2014, plurals and misspellings were added as ‘close variants’ to phrase and exact match keywords.
Examples provided by Google include scenarios where additional words are implied, where a term paraphrases the keyword, or where the words indicate the same intent.
Whereas plurals and misspellings were fairly straightforward to understand and to some degree predict, similar intent is broader and may warrant paying closer attention.
The other part worth paying attention to is that similar intent may not equate to similar value.
For reasons that can be hard to grasp, even minor differences between keywords can equate to big differences in conversion rates.
This is even true for plurals and singulars, so it’s definitely a good idea to confirm with your data that the word ‘campsites’ vs. ‘camping’ performs similarly. If not, then they should be managed as separate keywords with different bids.

Query Management Is a Must
Specifically, you need to make sure that your process of periodically evaluating queries continues to be done.
This process will help you find new negative keywords as well as high-quality queries that should be added as managed keywords.

Should You Be Worried? Let’s Find Out!
This type of change is causing some buzz.
Hey, maybe you’re even a bit uneasy because you’d like to look beyond those hypothetical examples we got from Google about keywords for “camping in Yosemite”.
So let’s take a look at how this change is impacting your account.

1. Using the Ads Interface to Investigate
To get a sense of the impact this change will have on your account, and how misspellings and plurals are already impacting your account since the change in 2014, you can refer to the Search Terms report in Google Ads.
Be sure to add the Keyword column so you can see which keyword triggered a particular search term.

This Report Has a Few Shortcomings
The Match Type column refers to how the keyword matched the query, and not the match type of the keyword itself. This is one of those nuances in Google Ads; match type can refer to two very different things.
So the only way to see the match type of the keyword is to look at the special characters in the Keyword column. For example, square brackets around the keyword mean it’s an exact match keyword.
This limitation makes it a bit harder to do a quick analysis of how exact match keywords are getting matched to close variants. And if you try to filter for keywords that contain the text ‘[’, Google says there are no matches, since the brackets are not technically part of the keyword text.
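Since the interface won’t filter on those special characters, one workaround in a downloaded report is to infer the match type from the characters wrapping the keyword text. The following is a hypothetical plain-JavaScript sketch (the function name is mine, and it assumes the export keeps the editor-style wrapping characters):

```javascript
// Infer a keyword's match type from the special characters in its text:
// [keyword] = exact, "keyword" = phrase, +words +with +plus = broad match
// modifier, anything else = broad. Mirrors the Google Ads editor syntax.
function inferMatchType(keywordText) {
  var text = keywordText.trim();
  if (text.charAt(0) === '[' && text.charAt(text.length - 1) === ']') {
    return 'EXACT';
  }
  if (text.charAt(0) === '"' && text.charAt(text.length - 1) === '"') {
    return 'PHRASE';
  }
  if (text.indexOf('+') !== -1) {
    return 'BROAD_MATCH_MODIFIER';
  }
  return 'BROAD';
}
```

With a column computed this way, filtering the spreadsheet down to exact match keywords becomes a one-click operation.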
The other limitation I see is that the report only contains the performance metrics of the queries. And while you can certainly use this data to weed out low performing queries, I like doing a slightly deeper analysis that also takes into account the relative performance of the query compared to that of the keyword.
The methodology is as follows:
Download a keyword performance report (including the keyword match type).
Download the search terms performance report.
Do a VLOOKUP to match every search term to the keyword that triggered it.
Get all the keyword and query data for each query into individual rows.
Filter the data to do the analysis.
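Under the hood, steps 3 and 4 amount to an indexed lookup, much like a spreadsheet VLOOKUP. Here is a hedged sketch in plain JavaScript; all field names (adGroupId, keywordText, and so on) are illustrative assumptions, not the actual report column names:

```javascript
// Match each search term row to the keyword that triggered it, producing
// one combined row per query (the VLOOKUP step done in code).
function joinQueriesToKeywords(keywordRows, queryRows) {
  // Index keywords by ad group + keyword text, like a VLOOKUP key column.
  var keywordIndex = {};
  keywordRows.forEach(function (kw) {
    keywordIndex[kw.adGroupId + '|' + kw.keywordText] = kw;
  });
  // Attach the triggering keyword's data to every query row.
  return queryRows.map(function (q) {
    var kw = keywordIndex[q.adGroupId + '|' + q.keywordText] || null;
    return {
      query: q.query,
      queryClicks: q.clicks,
      queryConversions: q.conversions,
      keyword: kw ? kw.keywordText : null,
      matchType: kw ? kw.matchType : null,
      keywordClicks: kw ? kw.clicks : null
    };
  });
}
```

Once each row carries both query and keyword metrics, comparing their relative performance is a simple filter.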
So let’s go and automate this.

3. Analyze the Close Variant Impact on Your Ads with Google Ads Scripts
Thanks to Google Ads Scripts, you can automate the analysis so that you can easily replicate it for other accounts you manage.
Another nice benefit of scripts is that if you find the need to add negative keywords, you can automate that by adding a few more lines of code to the script.
The script also adds a match subtype column where I consider BMM (broad match modifier) to be a unique match type that is different from broad match. (Google doesn’t consider BMM to be its own match type.)

Script Settings
Really the only things you should edit are the email addresses that need to get an email when a report is ready and the usernames of everyone who should be allowed to access the report that is generated in Google Sheets.
So update the variables ‘emailAddresses’ and ‘accountManagers’ and leave everything else as-is unless you’re familiar with scripts and you know what you’re doing.
var time = 'LAST_30_DAYS';
var reportVersion = 'v202302';

The Output of the Script
Here’s an example of the data you’ll get:
I’ve already used filters in Sheets to see only exact match keywords that were matched to a close variant. In this account the only variants are typos and plurals.
I’m going to continue monitoring this account with the script to find out what words Google considers as having the same intent.

Counting the Proximity Between the Query & Keyword
In the output, I wanted a way to more easily see how aggressive close variants are. In other words, how far they are from the keyword.
I figured one way to do this analysis is by counting the number of differences between the query and the keyword.
The Levenshtein distance seemed like a good measure to use as it counts the number of characters that need to be changed to transform one string (the keyword) into another string (the query).
In a pluralization, you would expect the difference to usually be one character (the addition or removal of the letter ‘s’ in English). Typos will usually consist of somewhere between 1 and 3 incorrectly typed characters.
So by looking at variants where the Levenshtein distance is in the range of 1 to 3, I can find the typical close match variants Google has now been doing for several years.
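For reference, this is the textbook dynamic-programming implementation of the Levenshtein distance in plain JavaScript (a standard algorithm, not code taken from the author’s script):

```javascript
// Classic dynamic-programming Levenshtein distance: the minimum number of
// single-character insertions, deletions, or substitutions needed to turn
// one string (the keyword) into another (the query).
function levenshtein(a, b) {
  var rows = a.length + 1;
  var cols = b.length + 1;
  var d = [];
  for (var i = 0; i < rows; i++) {
    d[i] = [i];                      // deleting i characters from a
  }
  for (var j = 0; j < cols; j++) {
    d[0][j] = j;                     // inserting j characters into a
  }
  for (i = 1; i < rows; i++) {
    for (j = 1; j < cols; j++) {
      var cost = a.charAt(i - 1) === b.charAt(j - 1) ? 0 : 1;
      d[i][j] = Math.min(
        d[i - 1][j] + 1,             // deletion
        d[i][j - 1] + 1,             // insertion
        d[i - 1][j - 1] + cost       // substitution (or free match)
      );
    }
  }
  return d[rows - 1][cols - 1];
}
```

Note that a simple transposition (e.g., ‘camping’ vs. ‘campign’) counts as two edits under this measure, which still lands it comfortably inside the 1-to-3 range for typos.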
By looking for higher distances, I will be able to find where the words have been changed to ones with similar intent.

Conclusion
As Google is always updating its ad system, it’s critical for the humans overseeing the accounts to take on the role of the pilot who oversees that the automation is doing its job well.
Tools like Google Ads scripts are a great way to make that job easier by pointing out potential issues, so that the account manager neither has to blindly trust the automation nor check everything manually.
I hope my script helps you do your job better and in less time.
More Paid Search Resources:
All screenshots taken by author, September 2023
Google’s John Mueller was asked if “unlinked brand mentions” were important in Google’s algorithm. It was apparent from John’s response that “brand mentions” is probably not a real thing in Google’s algorithm, but he also said that there may be value to site visitors who encounter them.

Brand Mentions
There is a longstanding idea in the SEO community that Google uses mentions of a website as a form of link.
The unlinked URL idea subsequently evolved into the idea that if a website mentions another site’s brand name, that Google will also count that as a link. This is the “brand mentions” idea.
But there was never any evidence of that until around 2012 when Google published a patent called Ranking Search Results.
The patent was several pages long and buried deep in the middle of it was the mention of an “implied link” being used as a type of link, which was different from an “express link” which is described as a traditional hyperlink.
The phrase “implied links” only occurs a couple of times in this one paragraph.

Google’s John Mueller Discussing Unlinked Brand Mentions

Two Main Ranking Factors Discussed in the Patent
To understand what the authors meant by an implied link you have to scroll up the page back to a section labeled “Background” where the authors explain what the patent is actually about.
These are the two most important factors discussed in the patent:
The authors explain they are using independent links to a website as part of the ranking process. They call the site being linked to a “target resource.”
The authors also say that they are ranking search results by using search queries that contain a reference to a website, what they again call a “target resource.”
The patent explains without ambiguity that this second type of link is a search query that uses a brand name, what the SEO industry calls Branded Search Queries.
Where the patent makes a reference to a “group of resources,” it is referring to a group of web pages.
A resource is a web page or a website.
A group of resources is a group of web pages or websites.
One more time:
When the patent mentions a “resource” it’s talking about web pages or websites.
The patent states:
“A query can be classified as referring to a particular resource if the query includes a term that is recognized by the system as referring to the particular resource.
For example, a term that refers to a resource may be all of or a portion of a resource identifier, e.g., the URL, for the resource.”
The above explanation defines what the authors call “reference queries.”
A reference query is what the SEO community refers to as branded search queries.
A branded search query is a search someone performs on Google using a keyword plus the brand name, the domain of a website or even a URL, which is exactly what the patent defines as reference queries.
What the algorithm described in the patent does with those “reference queries” (branded search queries) is to use them like links.
The algorithm generates what’s called a “modification factor” which modifies (re-ranks) the search results according to this additional data.
The additional data is:
1. A re-count of inbound links using only “independent” links (links not associated with the site being ranked.)
2. Reference queries (branded search queries) are used as a type of link.
Here is what the patent states:
“The system generates a modification factor for the group of resources from the count of independent links and the count of reference queries…”
What the patent does is filter out some hyperlinks in order to use only independent links, and also use branded search queries as another type of link, which can be defined as an implied link.

How the Idea of Brand Mentions Was Born
Some in the SEO community took one paragraph out of context in order to build their “brand mentions” idea.
The paragraph begins by talking about using independent links for ranking search results, just as is described in the background section of the patent.
“The system determines a count of independent links for the group (step 302).
A link for a group of resources is an incoming link to a resource in the group, i.e., a link having a resource in the group as its target.”
The above statement matches exactly what the entire patent talks about, independent links.
The next section is the part about “implied links” that has confused the search industry for the past ten years.
Two things to note in order to more easily understand what is written:
A “source resource” is the source of a link, the page that is linking out.
A “target resource” is what is being linked to (and ranked).
This is what the patent says:
“Links for the group can include express links, implied links, or both.
An express link, e.g., a hyperlink, is a link that is included in a source resource that a user can follow to navigate to a target resource.
An implied link is a reference to a target resource, e.g., a citation to the target resource, which is included in a source resource but is not an express link to the target resource.
Thus, a resource in the group can be the target of an implied link without a user being able to navigate to the resource by following the implied link.”
The key to understanding what an “implied link” is lies in the very first mention of the phrase.
Here it is again, with my emphasis:
“An implied link is a reference to a target resource…”
Clearly, the use of the word “reference” ties back to the second part of what the patent talks about: reference queries.
The patent talks about reference queries (aka branded search queries) from the beginning to the end.
In retrospect it was a mistake for some in the SEO industry to build an entire theory about brand mentions from a single paragraph that was removed from the context of the entire patent.
It’s clear that “implied links” are not about brand mentions.
But that’s background information on how “brand mentions” was popularized.

John Mueller on Unlinked Brand Mentions and Google’s Algorithm

Question About Unlinked Brand Mentions
The question about brand mentions had a lot of background information to unpack. So thanks for sticking around for that because knowing it is helpful to understanding the question and John Mueller’s answer.
Here is the question that was asked:
“In some articles I see people are speaking about unlinked brand mention.
I want to know your opinion in this case.
Do you think it’s also important for algorithm, unlinked brand mention?”

Are Brand Mentions Important to Google’s Algorithm?
The concept of “brand mentions” appeared to be unclear to John Mueller.
So Mueller asked a follow-up question:
“How do you mean, ‘brand mentions’?”
The person asking the question elaborated on what he meant:
“It’s like another website and article speaking about my website brand, but it doesn’t link to me.”
John Mueller answered:
“I don’t know.
I think that’s kind of tricky because we don’t really know what the context is there.
I mean, I don’t think it’s a bad thing, just for users.
Because if they can find your website through that mention, then that’s always a good thing.
But I wouldn’t assume that there’s like some… I don’t know… SEO factor that is trying to figure out where someone is mentioning your website name.”

Brand Mentions Are Not an SEO Factor
John Mueller confirmed that brand mentions are not a search engine optimization factor.
Given that the foundation of the “brand mentions” idea is built on one paragraph of a patent that’s been taken out of context, I would hope the SEO community will set aside the idea that “brand mentions” are an SEO factor.
Mueller did say that brand mentions can be useful for helping users become aware of a website. And I agree that’s a good way to think about brand mentions as a way to get the word out about a website.
But brand mentions are not an SEO factor.

Just Because it’s in a Patent Doesn’t Mean it’s in Use
One last note about the patent that mentions “reference queries.”
It’s important to understand that something isn’t necessarily in use by Google just because it appears in a patent or a research paper.
Google could be using it or maybe not. Another consideration is that this is an older patent and Google’s search algorithm is constantly changing.

Citations

Read the Patent from 2012: Ranking Search Results

Watch Google’s John Mueller Answer About Brand Mentions
Watch at 12:01 minute mark:
Unless you’ve been living under a rock, metaphorically of course, you must have read the news that Google doesn’t provide a free Google Apps plan anymore. This means that if you want to host email for your domain with Google Apps, it is going to cost you $5/user/month. Millions of individual domain owners had been using Gmail for their email needs and will now need to either switch providers or lose access to their emails.
Thankfully, there are other options out there, and one of them is Microsoft’s new and shiny Outlook.com service. Read on to find out how to use Outlook.com to host the email for your domain.

The first thing that you need to do is sign up for an Outlook.com account.

Enter your domain name in the text box, select the radio button that says “Set up Outlook.com for my domain” and hit Continue.
DNS Settings? What’s that?
Let’s take a step back to understand how email is delivered. DNS is the domain name system that lets you browse to any website on the Internet by just using its domain name. It comprises a bunch of different entries (records) for each domain name on the Internet. The servers that maintain these records are called DNS servers.
Each domain has an associated MX record. The MX record points to the server that is supposed to accept emails for that particular domain. This record is the one that has to be changed if you want Outlook.com to manage your email instead of Google or any other provider that you were using earlier. If you don’t know how to edit the DNS settings for your domain, head over to your domain registrar’s website and their tech support staff will be happy to handle the changes for you.
Once you’ve made the above changes, DNS systems around the world are notified of the change to your domain. This process happens automatically and your new settings can take a couple of hours to propagate throughout the Internet.
After a few hours, refresh the page and Outlook will detect the DNS changes you made and let you add users to your account. Add new users from the “Member Accounts” link and start using your new Outlook.com email.
Wasn’t that easy? Now, enjoy your free email service without all the hassles of managing an email server.
Sharninder is a programmer, blogger and a geek making a living writing software to change the world. His tech blog, Geeky Ninja, is where he shares his wisdom, for free !
Google’s John Mueller was asked in a Webmaster Hangout what to do if a site is suffering a traffic loss due to Google’s June 2019 broad core algorithm update. John Mueller’s answer provided insights into understanding what is happening.
Then Mueller provided hope that Google may offer further guidance on what to do.

Webmaster Asks If It’s a Content Issue?
The person asking the question states they’re a news publisher. They suggest that, because they deal in content, the core update issue for them may be content related.
Here is the question:
“We’re a news publisher website, primarily focusing on the business finance vertical. We probably have been impacted by the June Core Update as we’ve seen a drastic traffic drop from the June 1st week.
Agreed that the update specifies that there are no fixes and no major changes that need to be made to lower the impact.
But for a publisher whose core area is content news, doesn’t it signal that it’s probably the content, the quality or the quantity which triggered Google’s algorithm to lower down the quality signal of the content being put up on the website which could have led to a drop of traffic?”
The questioner states that webmasters need more guidance:
“Not site specific, but category or vertical specific at least on how to take corrective measures and actions to mitigate the impact of core updates.

It would go a long way in helping websites who are now clueless as to what impacted them.”

Why Nothing to Fix
John Mueller did not suggest fixing anything specific. He explained that the reason there’s nothing specific to fix is because a core update encompasses a broader range of factors.
Google’s John Mueller explains:
“I think it’s a bit tricky because we’re not focusing on something very specific where we’d say like for example when we rolled out the speed update.
That was something where we could talk about specifically, this is how we’re using mobile speed and this is how it affects your website, therefore you should focus on speed as well.”Core Update, Relevance and Quality
John Mueller then discussed the core updates within the context of relevance and quality updates. He did not say that core algo updates were specifically just about relevance or just about quality. He seemed to mention those two aspects as a way to show how these kinds of updates do not have specific fixes.
Here is how John Mueller explained it:
“With a lot of the relevance updates, a lot of the kind of quality updates, the core updates that we make, there is no specific thing where we’d be able to say you did this and you should have done that and therefore we’re showing things differently.”
John Mueller then explained, as an example, of how changes that are external to a website could impact how Google ranks websites.
This is what he said:
“Sometimes the web just evolved. Sometimes what users expect evolves and similarly, sometimes our algorithms are, the way that we try to determine relevance, they evolve as well.”
That may be the most a Googler has said so far to explain about core algorithm updates.
John mentions quality, but he also mentioned how users and the web evolve. That’s not a quality issue. Those are factors that are external to a website that need to be considered.

Nothing to Fix
John Mueller related that there aren’t specific things to fix. But he suggested that it may be useful to understand how users see your site, how useful your site is to users.
Here’s what John Mueller said:
“And with that, like you mentioned, you’ve probably seen the tweets from Search Liaison, there’s often nothing explicit that you can do to kind of change that.
“What we do have is an older blog post from Amit Singhal which covers a lot of questions that you can ask yourself, about the quality of your website. That’s something I always recommend going through. That’s something that I would also go through with people who are not associated with your website.”
John Mueller may have been citing a Webmaster Central blog post from 2011 titled “More Guidance on Building High-quality Sites.”
In it, the author provides a large number of questions a site owner should ask themselves about their content.
Here is a sample of the kinds of questions Google suggests you should ask yourself:
“Would you trust the information presented in this article?
Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Would you be comfortable giving your credit card information to this site?
Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
Does the page provide substantial value when compared to other pages in search results?”

Ask a Third Party For a Critique
John Mueller then suggested that a third party that is unfamiliar with your site may be able to see issues that are not apparent to you.
What John Mueller said:
“So, often you as a site owner you have an intimate relationship with your website you know exactly that it’s perfect. But someone who is not associated with your website might look at your website and compare it to other websites and say, well, I don’t know if I could really trust your website because it looks outdated or because I don’t know who these people are who are writing about things.
All of these things play a small role and it’s not so much that there’s any technical thing that you can change in your line of HTML or server setting.
It’s more about the overall picture where users nowadays would look at it and say, well I don’t know if this is as relevant as it used to be because these vague things that I might be thinking about.
So that’s where I’d really try to get people who are un-associated with your website to give you feedback on that.”
John Mueller suggested asking in web communities, including the Webmaster Help Forums, to see how others see your site, if they could spot problems.
One issue with that is that every community has specific points of view that sometimes don’t allow members to get past their biases to see what the real problem is. That’s not a criticism but an observation that opinions tend to vary.
Here’s what he said:
“…you can talk with other people who’ve seen a lot of websites and who can look at your websites and say well, I don’t know the layout looks outdated or the authors are people that nobody knows or you have stock photos images of instead of author photos. It’s like, why do you have that?
All of these things are not explicit elements that our algorithms would be trying to pinpoint but rather things that kind of combine to create a bigger picture.”
I know from experience that it’s not uncommon for a site owner who comes to me for help to be surprised that their site has content problems, is outdated in some way, or has room for improvement in how the content is written.
Sometimes they intuit that something is wrong but they can’t see it. I once had a site owner come to me with a negative SEO problem but the feedback I received directly from Google was that they were suffering from content issues related to Google’s Panda algorithm.
It was a shock for them to hear that their content was bad. But having it confirmed by Google made them better able to see that yes, there were problems with the content.
Google May Provide Additional Official Guidance
Mueller then offered hope by suggesting he would inquire about providing additional guidance for web publishers.
Takeaway: See the Big Picture
The important takeaways are to be able to step back and see the big picture, which means:
Some issues are external to the website. For example, many fashion brands no longer publish blogs. An SEO recently attributed that to a failure in the content strategy. But that’s missing the big picture. The reason many fashion brands no longer publish blog posts is because users don’t consume that kind of content. They consume content on Instagram, Facebook, Twitter and other social media websites.
That’s an example of how users evolve and how it’s not a problem with your site, but rather a change in user habits that may be reflected in the kinds of pages that Google shows in the search results.
Takeaway: Algorithms Evolve
Google’s algorithm does not really match keywords to web pages. It’s about solving problems for users. Google is continually updating how it understands what users want when they type a query, and how it understands the problems a web page solves.
A website that focuses too much on keywords, and not enough on providing quick information to users who need it quickly and deep information to users who need depth, may find that Google’s algorithms no longer favor it. Not because the site is broken and needs fixing, but because it does not solve the problem for users in the way Google has determined users want it solved.
Takeaway: Have a Third Party Review Your Site
Lastly, it may be helpful to have a fresh set of eyes review your website. If that doesn’t provide insights, then someone with experience diagnosing relevance issues may be useful.
Read: June 2023 Broad Core Algo Update: It’s More than E-A-T
Read: What is a Broad Core Algorithm Update?
Watch: Webmaster Hangout
Screenshots by Author, Modified by Author
The May 2023 core update highlights how different types of businesses are affected.
In the past, Google created and publicized named algorithm updates like ‘Panda’, ‘Penguin’ and ‘Hummingbird’ focused on introducing a major change to improve results quality. Google was fairly open about the goal of an update and the impact it could have.
Today, named updates are less common. Instead, you will probably have noticed that Google makes several ‘core updates’ each year, which it may announce through its Search Liaison Twitter channel or its Webmaster Tools blog.
In May we saw a core update announced, as usual, via the Google Search Liaison Twitter channel, which we recommend you follow:
Download our Free Resource – 10 business-limiting SEO mistakes
All ‘hands-on’ marketers need to understand the main factors so they can ask the tough questions of SEO specialists and agencies.
In their post What webmasters should know about Google’s core updates, Google explains them as follows:
“Each day, Google usually releases one or more changes designed to improve our search results. Most aren’t noticeable but help us incrementally continue to improve.
Sometimes, an update may be more noticeable. We aim to confirm such updates when we feel there is actionable information that webmasters, content producers, or others might take about them.
Several times a year, we make significant, broad changes to our search algorithms and systems. We refer to these as “core updates.”
In the Webmaster Tools blog post, Google notes that there is usually nothing wrong with pages that perform less well after a core update. Instead, the changes are about improving how Google’s systems assess content overall.
Actions for businesses to take?
Given that the core updates could make a significant difference to your traffic and business you should ensure you routinely track the impact of these through monthly or weekly reviews.
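The routine tracking described above can be sketched in a few lines of code. This is a hypothetical illustration, not an official tool: the weekly click figures are invented, and in practice you would export them from Google Search Console or Google Analytics.

```python
# Hypothetical sketch: flag significant week-over-week drops in organic clicks.
# The data below is invented for illustration; in practice you would export
# these figures from Google Search Console or Google Analytics.

def flag_drops(weekly_clicks, threshold=0.20):
    """Return (week_index, pct_change) pairs where clicks fell by more than
    `threshold` (e.g. 0.20 = 20%) versus the previous week."""
    drops = []
    for i in range(1, len(weekly_clicks)):
        prev, curr = weekly_clicks[i - 1], weekly_clicks[i]
        if prev == 0:
            continue  # skip weeks with no prior data to compare against
        change = (curr - prev) / prev
        if change <= -threshold:
            drops.append((i, round(change, 2)))
    return drops

# Example: a sudden drop in week 3 of the kind that might coincide
# with a core update rollout.
clicks = [1000, 980, 1010, 560, 590]
print(flag_drops(clicks))  # → [(3, -0.45)]
```

A drop flagged this way is only a signal to investigate; it may line up with a confirmed core update announcement, or have an unrelated cause such as seasonality or a site change.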
We recommend you use Google Analytics or Google Search Console to track changes each month, or implement a dedicated change monitoring dashboard like the one created by Google Analytics consultant Aleyda Solis.
4 case studies to provide further insight into how Google core updates affect businesses in different ways
An article by Glen McCabe presents four insightful case studies that show how core updates affect businesses in different ways.
Here are some of the insights from Glen McCabe’s review that identified key areas to consider:
Case 1 – Recovery From The March 2023 Core Update
The first case study covers a site that saw a 40% drop in traffic from the March 2023 core update and how it recovered.
The site had several key UX issues, including having giant calls to action pushing down the main content throughout the site.
The site combined some important content on one page that should have been on multiple pages.
As the site specialized in the health/medical sector, it is categorized as “Your Money or Your Life” (YMYL). Any site that can “impact a person’s future happiness, health, financial stability, or safety” is held to a higher standard, so it’s important to make sure content is written by expert-level authors.
Several technical SEO problems were discovered. For example, the site’s internal linking structure was preventing key pages throughout the site from linking directly to other key pages.
Case 2 – From Medieval Panda to Core Updates and Beyond
The second case study is an example of how a large-scale, complex site in the news sector addressed overall site improvements and introduced a continuous audit process to help manage core updates.
This case study highlights the importance of identifying important lower-quality URLs that can affect performance to help a site maintain high-quality indexation.
When managing complex sites, more technical SEO issues can occur. So the lesson from the case study is to continually analyze the site to surface those issues and fix them as quickly as possible. Canonical problems, render problems, meta robots tag issues, and performance problems are some technical issues to review.
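Two of the issues listed above, meta robots and canonical tags, can be spot-checked programmatically. The sketch below is a minimal illustration (not a crawler or an official tool) using Python’s standard-library HTML parser; the sample HTML and URLs are invented.

```python
# Minimal sketch: scan a page's HTML for a meta robots "noindex" directive
# and a canonical tag pointing away from the page's own URL. Illustrative
# only; real audits would use a crawler and handle many more cases.
from html.parser import HTMLParser

class SEOTagAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def audit_page(html, page_url):
    parser = SEOTagAudit()
    parser.feed(html)
    issues = []
    if parser.noindex:
        issues.append("page is set to noindex")
    if parser.canonical and parser.canonical != page_url:
        issues.append(f"canonical points elsewhere: {parser.canonical}")
    return issues

sample = ('<head><meta name="robots" content="noindex,follow">'
          '<link rel="canonical" href="https://example.com/other"></head>')
print(audit_page(sample, "https://example.com/page"))
```

Running checks like this across a list of key URLs is one way to turn the “continuous audit process” from the case study into a repeatable routine.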
In line with the E-A-T acronym, it is important to recognize the strengths of your content and site performance. Authority was a strength for this news organization, so the action taken was to review content so it was consistently positioned that way.
Case 3 – The Ghost of Fred And A Reminder To Continually Analyze
The third case study is an example of a site that has weathered challenges from major algorithm updates and how it recovered from them. This is common when sites operate in a competitive niche where content is very similar to the competition’s.
Tough Niche and The Need To Differentiate Your Site
Google organic traffic had dropped by 40% since the May update. An audit identified too many similarities between the site’s content and that of other competitors in the market. Sites need to differentiate as much as possible from competitors, because it becomes very hard for Google to determine which site should rank above the others.
The depth of content across the site, specific to its niche, was identified as a factor affecting performance. When this occurs, it is important to review and update low-quality indexed pages to improve the overall indexation quality of the site.
Case 4 – Recovery From A Non-Core Update During The May 2023 Core Update
The final case study is of an affiliate site that experienced a 44% drop in traffic, pretty much overnight.
Although the site was centered around an affiliate platform, the CTAs were large and disruptive across the site. This positioning also affected how content was presented, with the main content most of the time pushed to the bottom of the page, creating UX issues.
Across the case studies, a recurring theme is a lack of depth in quality content across sites. This is especially important when reviewing actions to take around Google core updates, as low-quality content is evaluated by Google during updates as well.
Reviewing redirects and followed links on-site is another area to review during the updates. This example identified redirects set up from several older URLs to a sister site of the company.
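A redirect review of the kind mentioned above can be sketched as a small script. This is a hypothetical illustration: the redirect map and URLs are invented, standing in for data you might export from a crawl tool.

```python
# Hypothetical sketch: given a map of known redirects (e.g. pulled from a
# crawl export), follow each URL through the map to surface chains longer
# than one hop and detect loops. The URLs below are invented.

def resolve_chain(redirects, url, max_hops=10):
    """Follow `url` through the redirect map; return the list of hops."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in chain:          # loop detected: stop and report it
            chain.append(url)
            return chain
        chain.append(url)
    return chain

redirects = {
    "https://old.example.com/a": "https://old.example.com/b",
    "https://old.example.com/b": "https://sister.example.net/b",
}
chain = resolve_chain(redirects, "https://old.example.com/a")
print(chain)  # 3 entries = a 2-hop chain worth collapsing into one redirect
```

Chains of more than one hop, and any redirects landing on a different domain (as in the sister-site example from the case study), are the candidates to review first.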
Based on the insights from the above case studies, Google core updates can make a significant impact on site performance, and the drops in traffic can be instant. For many marketers, it would be quite alarming to start work one morning and find drops like these.
Following the guidelines, regularly monitoring the technical SEO performance of a site and committing to high-quality, relevant content across the site appears to be a good starting point. The insights shared in the case studies give marketers areas to review after the latest Google core update.
Additional resources and tools to help marketers monitor the latest Google core update
Aleyda Solis: Analyzing A Recent Google Update Impact via Search Console Data w/ a Segmented Google Data Studio Report
Google Webmaster tools blog: What webmasters should know about Google’s core updates