What webmasters should know about Google’s “core updates”
Thursday, August 01, 2019
Each day, Google usually releases one or more changes designed to improve our search results. Most aren’t noticeable but help us continue to improve incrementally.
Sometimes, an update may be more noticeable. We aim to confirm such updates when we feel there is actionable information that webmasters, content producers or others might take in relation to them. For example, when our “Speed Update” happened, we gave months of advance notice and advice.
Several times a year, we make significant, broad changes to our search algorithms and systems. We refer to these as “core updates.” They’re designed to ensure that overall, we’re delivering on our mission to present relevant and authoritative content to searchers. These core updates may also affect Google Discover.
We confirm broad core updates because they typically produce some widely notable effects. Some sites may note drops or gains during them. We know those with sites that experience drops will be looking for a fix, and we want to ensure they don’t try to fix the wrong things. Moreover, there might not be anything to fix at all.
Core updates & reassessing content
There’s nothing wrong with pages that may perform less well in a core update. They haven’t violated our webmaster guidelines nor been subjected to a manual or algorithmic action, as can happen to pages that do violate those guidelines. In fact, there’s nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.
One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later, in 2019, you refresh the list. It’s going to naturally change. Some wonderful new movies that didn’t exist before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before.
The list will change, and films previously higher on the list that move down aren’t bad. There are simply more deserving films that are coming before them.
Focus on content
As explained, pages that drop after a core update don’t have anything wrong to fix. This said, we understand those who do less well after a core update may still feel they need to do something. We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.
A starting point is to revisit the advice we’ve offered in the past on how to self-assess if you believe you’re offering quality content. We’ve updated that advice with a fresh set of questions to ask yourself about your content:
Content and quality questions
- Does the content provide original information, reporting, research or analysis?
- Does the content provide a substantial, complete or comprehensive description of the topic?
- Does the content provide insightful analysis or interesting information that is beyond obvious?
- If the content draws on other sources, does it avoid simply copying or rewriting those sources and instead provide substantial additional value and originality?
- Does the headline and/or page title provide a descriptive, helpful summary of the content?
- Does the headline and/or page title avoid being exaggerated or shocking in nature?
- Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
- Would you expect to see this content in or referenced by a printed magazine, encyclopedia or book?
- Does the content present information in a way that makes you want to trust it, with clear sourcing, evidence of the expertise involved, and background about the author or the site that publishes it (for example, through links to an author page or a site’s About page)?
- If you researched the site producing the content, would you come away with an impression that it is well-trusted or widely-recognized as an authority on its topic?
- Is this content written by an expert or enthusiast who demonstrably knows the topic well?
- Is the content free from easily-verified factual errors?
- Would you feel comfortable trusting this content for issues relating to your money or your life?
Presentation and production questions
- Is the content free from spelling or stylistic issues?
- Was the content produced well, or does it appear sloppy or hastily produced?
- Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
- Does the content have an excessive amount of ads that distract from or interfere with the main content?
- Does the content display well when viewed on mobile devices?
- Does the content provide substantial value when compared to other pages in search results?
- Does the content seem to serve the genuine interests of visitors to the site, or does it seem to exist solely because someone attempted to guess what might rank well in search engines?
Beyond asking yourself these questions, consider having others you trust but who are unaffiliated with your site provide an honest assessment.
Also, consider an audit of the drops you may have experienced. What pages were most impacted and for what types of searches? Look closely at these to understand how they’re assessed against some of the questions above.
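The audit described above can be sketched in code. This is a minimal illustration, not an official tool: it assumes you have tallied per-page clicks (for example, from a Search Console Performance report export) for comparable periods before and after an update, and the data and function names are hypothetical:

```python
def biggest_drops(before, after, top_n=5):
    """Compare per-page clicks before and after an update.

    before/after: dicts mapping page URL -> total clicks for the period.
    Returns (page, delta) pairs sorted with the largest losses first.
    """
    deltas = []
    for page in before:
        # Pages missing from the "after" period count as zero clicks.
        delta = after.get(page, 0) - before[page]
        deltas.append((page, delta))
    deltas.sort(key=lambda pair: pair[1])  # most negative first
    return deltas[:top_n]

# Example with made-up numbers:
before = {"/guide": 1200, "/news": 800, "/about": 50}
after = {"/guide": 400, "/news": 900, "/about": 45}
print(biggest_drops(before, after))
# → [('/guide', -800), ('/about', -5), ('/news', 100)]
```

The pages at the top of this list, together with the queries they lost traffic for, are the ones worth reviewing against the questions above.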
Get to know the quality rater guidelines & E-A-T
Another resource for advice on great content is to review our search quality rater guidelines. Raters are people who give us insights into whether our algorithms seem to be providing good results, a way to help confirm our changes are working well.
It’s important to understand that search raters have no control over how pages rank. Rater data is not used directly in our ranking algorithms. Rather, we use it much as a restaurant might use feedback cards from diners. The feedback helps us know if our systems seem to be working.
If you understand how raters learn to assess good content, that might help you improve your own content and, in turn, perhaps do better in Search.
In particular, raters are trained to understand if the content has what we call strong E-A-T. That stands for Expertise, Authoritativeness and Trustworthiness. Reading the guidelines may help you assess how your content is doing from an E-A-T perspective and identify improvements to consider.
Here are a few articles written by third-parties who share how they’ve used the guidelines as advice to follow:
- E-A-T and SEO, from Marie Haynes
- Google Updates Quality Rater Guidelines Targeting E-A-T, Page Quality & Interstitials, from Jennifer Slegg
- Leveraging E-A-T for SEO Success, presentation from Lily Ray
- Google’s Core Algorithm Updates and The Power of User Studies: How Real Feedback From Real People Can Help Site Owners Surface Website Quality Problems (And More), Glenn Gabe
- Why E-A-T & Core Updates Will Change Your Content Approach, from Fajr Muhammad
Note: Links to the articles above are not endorsements of any particular SEO companies or services. We simply found the articles themselves to be helpful content on this topic.
Recovering and more advice
A common question after a core update is how long it takes for a site to recover if it improves its content.
Broad core updates tend to happen every few months. Content that was impacted by one might not recover, even assuming improvements have been made, until the next broad core update is released.
However, we’re constantly making updates to our search algorithms, including smaller core updates. We don’t announce all of these because they’re generally not widely noticeable. Still, when released, they can cause content to recover if improvements warrant.
Do keep in mind that improvements made by site owners aren’t a guarantee of recovery, nor do pages have any static or guaranteed position in our search results. If there’s more deserving content, that will continue to rank well with our systems.
It’s also important to understand that search engines like Google do not understand content the way human beings do. Instead, we look for signals we can gather about content and understand how those correlate with how humans assess relevance. How pages link to each other is one well-known signal that we use. But we use many more, which we don’t disclose to help protect the integrity of our results.
We test any broad core update before it goes live, including gathering feedback from the aforementioned search quality raters, to see whether the way we’re weighing signals seems beneficial.
Of course, no improvement we make to Search is perfect. This is why we keep updating. We take in more feedback, do more testing and keep working to improve our ranking systems. This work on our end can mean that content might recover in the future, even if a content owner makes no changes. In such situations, our continued improvements might assess such content more favorably.
We hope the guidance offered here is helpful. You’ll also find plenty of advice about good content with the resources we offer from Google Webmasters, including tools, help pages and our forums.
Posted by Danny Sullivan, Public Liaison for Search