From algo to aggro: How SEOs really feel about Google algorithm updates

As SEOs working in the weeds with our clients each day, it can sometimes be hard to truly see how major Google algorithm updates affect our industry as a whole. Sure, we can perform test after test to see how our clients are affected, but what about the poor account manager or technical SEO director who has to put in the extra work and placate potentially panicked and frustrated clients? How are they personally affected?

BrightLocal (my employer) anonymously polled 650 SEO professionals recently on this very subject, asking them a host of questions about how algorithm updates impact their workload, their client relationships and their job satisfaction. Below, I’ll go over some of the startling results from our survey, “The Human Impact of Google Algorithm Updates.”

Google update? What Google update?

First, and perhaps most alarmingly, 36 percent of respondents couldn’t say whether their business or their clients’ businesses had ever been impacted by a Google algorithm update. This should come as a shock — although this isn’t necessarily Day 1 SEO Stuff, it’s certainly Week 1 SEO Stuff.

The high percentage shown here suggests that either Google needs to better communicate the potential effects of an algorithm change (we can dream, right?) and/or SEOs and in-house marketers need to do more to stay on top of updates and investigate whether their clients have been affected by them.

‘And how does that make you feel?’

Of the significant 44 percent who said their business or their clients’ businesses had been affected by algorithm changes, 26 percent say they struggle to know how to react, and 25 percent get stressed when updates happen. (Note: For this question, respondents were able to select multiple answers.) However, on the flip side, an encouraging 58 percent either don’t get worried about updates or are actually excited by the challenge.

It’s perfectly natural for different types of people at different levels of experience to have differing reactions to potentially stressful situations, but 26 percent of respondents say they don’t even know how to react. This means that all the content you put out immediately after a Google update — whether to cash in on suddenly popular “what just happened to the Google algorithm” keywords or to genuinely help SEOs serve their clients better (we’re hoping it’s the latter) — isn’t reaching everyone.

At this point in the Google updates timeline, we should all, as content creators and content readers, be better versed in learning how to react after a Google update.

The penultimate straw

For many, it seems, the camel’s back can very nearly be broken by a surprise Google update. Just over a quarter of respondents said they’d considered leaving the SEO industry because of algorithm updates but ultimately decided to stick around.

It’s worth taking a step back next time an update hits. Take a look around your agency — are your SEO staff or colleagues ready to break? It takes strong leadership and a solid bedrock of skills for an SEO agency to bounce back from a big update, so make sure your best SEOs are made of the right stuff to prepare them for the worst — and, as we’ll see now, it gets bad.

How to lose clients and alienate Google

Nearly a third of respondents who said that Google updates had had an effect on business actually lost clients as a result.

But it’s not all bad news. Twenty-six percent won clients, 23 percent saw the opportunity to grow their work with existing clients, and 29 percent of respondents noticed no change after the update. So there’s quite a lot of positivity to be found here, especially considering respondents were able to choose multiple answers (which could mean that respondents both won and lost clients because of Google updates).

What this ultimately means is that what happens after a Google update is up to you. You can’t point at the above chart and say, “Well, everyone loses clients after a Google update,” because they don’t. The range of responses shows just how much is at stake when an update hits, but it also shows the huge opportunities available to those agencies that communicate with their existing clients quickly and knowledgeably, carefully managing expectations along the way, while also keeping their eye out for businesses who have taken a beating in rankings/traffic and are looking for help.

The client-agency relationship

One final point the survey touched on was the client-agency relationship and how it can be affected by Google updates. A majority agreed that updates make clients more dependent on agencies. (Who knew? It turns out that every time Google released an algorithm update, they were doing SEOs a favor all along!)

However, with that extra dependency comes extra scrutiny, as seen by the 31 percent of respondents who feel that Google updates lead to clients distrusting agencies. The wisest SEOs in this particular situation are the ones going into client update meetings with clear, transparent overviews of what the client’s money or their time is being spent on, and simplified (but not necessarily simple) explanations of the ramifications of the Google update.

And for the 28 percent who said that Google updates make clients consider changing agency? Well, I hope you do better next time!

What is the first thing you do when an algorithm update happens?

Before I leave you to stew on all that data and start pre-packing your next Google Update Emergency Go-Bag, here are some of the qualitative responses we received to one particular question in the survey, “What is the first thing you do when an algorithm update happens?” May these serve to remind you that whatever happens, no SEO is alone:

The data-divers

“Run ranking reports on all clients.”
“Review all the sites that are affected and determine what they have in common. That gives me a starting point as to what has changed.”
“Determine which high-volume pages are most impacted, then review existing SEO to try to uncover anything that might be the cause of the traffic from an on-page or technical SEO perspective.”

The researchers

“Read the posts on it to find out what happened and how to react.”
“Figure out how I need to change my strategy.”
“The first thing I do is research to find out what has been impacted. Next, I inform my team of what to expect from incoming client calls. Following that, I write an article for our blog to include our clients in on the updates.”
“Read, read, read everything I can get my hands on.”
“Read and study. Then work to fix it.”
“Check forums/respected sites to find out as much information as possible.”
“Get educated.”
“Read as much as I can on what happened/what was affected, then find what it did to my websites/keyword rankings, then rebuild and re-conquer.”
“Start reading news releases and blogs from highly respected SEO professionals to try to figure out the changes.”

The vice users

“Grab an adult beverage (or two).”
“Drink coffee.”
“Smoke a cigarette.”
“Go for a few beers.”
“Take a Xanax.”

The waiters

“Wait a few weeks while watching the SERPs.”
“Nothing, I wait for the algorithm to normalize. I take a look at websites that drop, and websites that increase in rankings. I then compare and contrast my clients’ sites to those. Once I have better understanding of how the algorithm affects sites, I will adjust the strategy.”
“Just ignore it for a couple weeks then make adjustments.”

The communicators

“Check for confirmation of update. Assess impact. Communicate with affected clients.”
“Share the news with my team and engage them in coming up with a plan.”

The extremes

“Prepare for the s***-storm ahead.”
“Freak out.”
“Cry.”

The one person who was actually positive about it

“Celebrate the new consulting opportunities that will result.”


Google confirms mid-December search ranking algorithm updates

Google has confirmed what many in the search industry have seen over the past week: updates to its algorithm that are significantly shifting rankings in the SERPs. A Google spokesperson told Search Engine Land, “We released several minor improvements during this timeframe, part of our regular and routine efforts to improve relevancy.”

Our own Barry Schwartz analyzed his Search Engine Roundtable survey of 100 webmasters and concluded that the updates are related to keyword permutations and sites utilizing doorway pages. You can read his full analysis here.

Early signs point to mobile & schema

I reached out to a few of the SEO tool vendors that do large scale tracking of ranking fluctuations to get their sense of where the updates may be targeted.

Ilya Onskul, the Product Owner of SEMrush Sensor, gave this analysis:

“SEMrush Sensor follows all the changes that occur on Google SERPs in 6 countries for both mobile and desktop separately. On top of the general volatility score per country, Sensor tracks scores for various industries and indicates the change in 15 SERP features and % of HTTPS and AMP.

Some industries experience more change than others on a daily basis (for example, due to higher competitiveness). Thus, Sensor introduced the Deviation score, which analyses which industries had the biggest volatility spikes in relation to their usual score.”

SEMrush Sensor data for all keyword categories (US) – December 20

Based on this data, Onskul concludes “Normally, December is one of the calmest months when it comes to SERP volatility as Google tries to minimize potential impact before big holidays. But something happened around December 14, something that Barry Schwartz called the Maccabees Update, or the pre-holiday update. Sensor spotted the highest SERP volatility on mobile (slightly less on desktop) across most categories, most affected on mobile being Autos & Vehicles, Law & Government, Reference.

In fact, right now, on December 19, Sensor is reporting another extreme spike in volatility. Now, Hobbies & Leisure, Science, Jobs & Education, Home & Garden, Internet & Telecom, have been affected the most. And the biggest fluctuations again take place on mobile.

Of course, it’s too early to come to conclusions on what’s going on and how to adjust to the changes (as we can’t really predict what exactly has changed), but what we know for now is that some new tweaks or updates were rolled out on December 19 for the US, and with a domino effect, the dramatic rise in volatility caught up in the UK, Germany, France, Australia and Spain the next day, which means that the update that was tested on the Google US on December 19 is now spreading across the world.”

We also reached out to Searchmetrics for their analysis. Founder and CTO Marcus Tober noted that they prefer to conduct a deep analysis of algorithmic fluctuations after a sustained change has taken place, saying, “At first we saw some changes that at first look looked like typical Panda and Phantom symptoms, but not on a large systematic scale. Many sites have lost visibility that have no Schema.org integration, but we can’t determine based on such a short time what are the overall systematic changes.”

The MozCast likewise continues to show ranking turbulence as the updates roll out:

MozCast for Tuesday, December 19

With the holidays upon us and what would otherwise have been a slow week ahead, now is a good time to check your rankings and start auditing if, where, and why you might see changes.

'Ask Me Anything' with Google's Gary Illyes at SMX East

At last week’s SMX East conference, Google’s webmaster trends analyst Gary Illyes took questions from the dual moderators — Barry Schwartz and Michelle Robbins — as well as from the audience in a session called “Ask Me Anything.”

In this post, I will cover that question-and-answer dialogue, though what you’ll see below are paraphrases rather than exact quotes. I have grouped the questions and used section headers to help improve the flow and readability.

Off-site signals

Barry: You’ve been saying recently that Google looks at other offsite signals, in addition to links, and some of this sounded like Google is doing some form of sentiment analysis.

Gary: I did not say that Google did sentiment analysis, but others assumed that was what I meant. What I was attempting to explain is that how people perceive your site will affect your business, but will not necessarily affect how Google ranks your site. Mentions on third-party sites, however, might help you, because Google looks to them to get a better idea what your site is about and get keyword context. And that, in turn, might help you rank for more keywords.

Imagine the Google ranking algo is more like a human. If a human sees a lot of brand mentions, they will remember that, and the context in which they saw them. As a result, they may associate that brand with something that they didn’t before. That can happen with the Google algorithm as well.

Mobile First, AMP, PWAs and such

Michelle: Where should SEOs focus their efforts in 2018?

Gary: If you are not mobile-friendly, then address that. That said, I believe the fear of the mobile-first index will be much greater than the actual impact in the end.

Michelle: When will mobile-first roll out?

Gary: Google doesn’t have a fixed timeline, but I can say that we have moved some sites over to it already. We are still monitoring those sites to make sure that we are not harming them inadvertently. Our team is working really hard to move over sites that are ready to the mobile-first index, but I don’t want to give a timeline because I’m not good at it. It will probably take years, and even then, will probably not be 100 percent converted.

The mobile-first index as a phrase is a new thing, but we have been telling developers to go mobile for seven years. If you have a responsive site, you are pretty much set. But if you have a mobile site, you need to check for content parity and structured data parity between your desktop and mobile pages. You should also check for hreflang tags and make sure that you’ve moved all media and images over.
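
As an aside, if you want to spot-check that parity yourself, here is a rough sketch of one way to do it. This is not a Google tool; the URLs are placeholders, and it only compares JSON-LD types and hreflang tags, so treat it as a starting point rather than a complete audit.

```python
# A rough sketch of checking structured data and hreflang parity between a
# desktop page and its mobile equivalent. The URLs below are placeholders.
import json
import requests
from bs4 import BeautifulSoup

def extract_signals(url):
    """Return the JSON-LD @type values and hreflang pairs found on a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    schema_types = set()
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(tag.string or "")
        except (json.JSONDecodeError, TypeError):
            continue
        items = data if isinstance(data, list) else [data]
        for item in items:
            if isinstance(item, dict) and "@type" in item:
                schema_types.add(str(item["@type"]))

    hreflangs = {
        (link.get("hreflang"), link.get("href"))
        for link in soup.find_all("link", rel="alternate")
        if link.get("hreflang")
    }
    return schema_types, hreflangs

desktop_schema, desktop_hreflang = extract_signals("https://www.example.com/page")
mobile_schema, mobile_hreflang = extract_signals("https://m.example.com/page")

print("Schema types missing on mobile:", desktop_schema - mobile_schema)
print("hreflang entries missing on mobile:", desktop_hreflang - mobile_hreflang)
```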

Michelle: Where does AMP fit? Is AMP separate from mobile-first? Is the only AMP benefit the increased site speed?

Gary: Yes, this is correct. AMP is an alternate version of the site. If you have a desktop site, and no mobile site, but do have an AMP site, we will still index the desktop site.

Michelle: If half a site is a progressive web app (PWA), and half is responsive, how does that impact search performance?

Gary: PWAs are JavaScript apps. If they can render, they will do pretty much the same as the responsive site. However, we are currently using Chrome Version 41 for rendering, and that’s not the latest, so there are newer APIs not supported by V41. If you are using those APIs, you may have a problem. Google is working to get to the latest version of Chrome for rendering, which will solve that issue.

Barry: I’ve seen desktop search showing one result and a mobile device showing a different page as an AMP result.

Gary: This happens because of our emphasis on indexing mobile-friendly sites. AMP is an alternate version of the regular mobile page. First, the mobile page gets selected to be ranked. Then the AMP page gets swapped in.

Michelle: So that means AMP is inconsequential in ranking?

Gary: Yes.

Michelle: Will there be a penalty for spamming news carousels?

Gary: We get that question a lot. I do not support most penalties. I (and many others at Google) would like to have algorithms that ignore those things [like spam] and eliminate the benefit. I’ve spoken with the Top Stories team about this, and they are looking into a solution.

Michelle: What about progressive web apps (PWAs)? Do they get the same treatment as AMP, i.e., no ranking boost?

Gary: If you have a standalone app, it will show up in the mobile-first index. But if you have both a PWA and an AMP page, the AMP page will be shown.

Michelle: What if the elements removed from your mobile-first site are ads? [Would that make the AMP version rank higher?]

Gary: Your site will become faster [by adopting AMP and eliminating these ads]. The “above the fold” algorithm looks at how many ads there are, and if it sees too many, it may not let your site rank as highly as it otherwise might. But when we’re looking at whether sites are ready for the mobile-first index, we’re more concerned about parity regarding content, annotations and structured data than ads.

Michelle: What about author markup?

Gary: Because AMP pages on a media site can show up in the news carousel, the AMP team said that you shouldn’t remove the author info when you’re creating AMP pages.

Search Console

Barry: When will SEOs be able to see voice search query information in Search Console?

Gary: I have no update on that. I’m waiting for the search team leads to take action on it.

Barry: How is the Search Console beta going?

Gary: It’s going well. There are a significant number of sites in the beta. We’re getting good feedback and making changes. We want to launch something that works really well. I’m not going to predict when it will come out of beta.

Barry: When will they get a year’s worth of data?

Gary: They have started collecting the data. Not sure if it will launch. The original plan was to launch with the new UI. [Gary doesn’t know if plans have changed, or when the new UI will launch.]

Barry: Why is there no Featured Snippet data in Search Console? You built it, tested it, and then didn’t launch it.

Gary: There is internal resistance at Google. The internal team leads want to know how it would be useful to publishers. How would publishers use it?

Barry: It would give us info on voice search.

Gary: I need something to work with to argue for it (to persuade the team leads internally at Google that it would be a good thing to release).

This question about how the featured snippet data would be used was then sent to the audience.

Eric Enge (your author) spoke from the audience: I’d like to use the data to show clients just how real the move to voice search is. There are things they need to do to get ready, such as understand how interactions with their customers will change.

Michelle: So, that data could be used to drive adoption. For now, that sounds like more of a strategic insight than immediately actionable information.

Gary: The problem is that voice search has been here for a couple of years. Voice search is currently optimized for what we have, and people shouldn’t need to change anything about their sites. Maybe there will be new technologies in the future that will help users.

Michelle: I think that it’s more complicated than that. There are things that you can do with your content that will help it surface better in search, and brands can invest resources in structuring content that can handle conversations better.

Ads on Google and the user experience

Michelle: As you (Google) push organic results below the fold [to give more prominence to ads and carousels] … is that a good user experience?

Gary: I click on a lot of search ads. (Note that Googler clicks that occur on our internal network don’t count as clicks for advertisers, so this costs you nothing.)

I believe that ads in search are more relevant than the 10 blue links. On every search page, there’s pretty aggressive bidding going on for every single position. Since bids correlate to relevance and the quality of the site, this does tend to result in relevant results.

Barry: Sometimes the ads are more relevant than the organic results …?

Gary: Especially on international searches.

Michelle: How is that determined?

Gary: This is done algorithmically.

Michelle: How can you compare ads to organic if the two aren’t working together?

Gary: The concept of a bidding process and the evaluation of quality are used by both sides. The separation between the groups is more about keeping the ads people who talk to clients away from the organic people, so they don’t try to influence them. The ads engineering people can talk to the organic side; that’s not forbidden.

Ranking factors and featured snippets

Michelle: Does Google factor non-search traffic into rankings?

Gary: First of all, search traffic is not something we use in rankings. As for other kinds of traffic, Google might see that through Analytics, but I swear we do not use Analytics data for search rankings. We also have data from Chrome, but Chrome is insanely noisy.

I actually evaluated the potential for using that data but couldn’t determine how it could be effectively used in ranking.

Barry: What about indirect signals from search traffic, such as pogosticking? Previously, Google has said that they do not use that directly for ranking.

Gary: Yes, we use it only for QA of our ranking algorithms.

Barry: At one point, Jeff Dean said that Google does use them.

Gary: I do not know what he was talking about. The RankBrain team is using a lot of different data sources. There was a long internal email thread on this topic, but I was never able to get to the bottom of it.

Michelle: Is RankBrain used to validate featured snippets?

Gary: RankBrain is a generic ranking algorithm which focuses on the 10 blue links. It tries to predict what results will work better based on historical query results. The featured snippets team uses their own result algorithm to generate a good result. I have not looked into what that means on their side. RankBrain is not involved, except that it will evaluate the related blue link.

Barry: Featured snippets themselves are fascinating. You said that they are changing constantly. Please explain.

Gary: The context for that discussion was about future developments for featured snippets. The team is working around the clock to improve their relevancy. The codebase underlying it is constantly changing.

Michelle: Does the device being used by the searcher factor in?

Gary: I don’t think so.

Schema and markup

Gary: I want to live in a world where schema is not that important, but currently, we need it. If a team at Google recommends it, you probably should make use of it, as schema helps us understand the content on the page, and it is used in certain search features (but not in rankings algorithms).

Michelle: Why do you want to be less reliant on it?

Gary: I’m with Sergey and Larry on this. Google should have algorithms that can figure out things without needing schema, and there really should not be a need for penalties.

Michelle: Schema is being used as training data?

Gary: No, it’s being used for rich snippets.

Michelle: Eventually the algo will not need the schema?

Gary: I hope so. The algorithms should not need the extra data.

Barry: Is there a team actively working on that?

Gary: Indirectly, absolutely. It probably involves some sort of machine learning, and if so, it’s the Brain team that works on it. I do not know if they have an active project for that.

Barry: How did you get entity data in the past?

Gary: From Freebase and the Knowledge Graph.

Panda and thin content

Barry: You said that pruning content was a bad idea. If you’re hit by Panda, how do people proceed?

Gary: Panda is part of our core ranking algorithm. I don’t think that anyone in a responsible position at Google thinks of Panda as a penalty. It’s very similar to other parts of the algorithm. It’s a ranking algorithm. If you do something to attempt to rank higher than you should, it basically tries to remove the advantage you got, but not punish you.

Ultimately, you want to have a great site that people love. That is what Google is looking for, and our users look for that, as well. If users leave comments or mention your site on their site and things like that, that will help your ranking.

Pruning does not help with Panda. It’s very likely that you did not get Pandalyzed because of your low-quality content. It’s more about ensuring the content that is actually ranking doesn’t rank higher than it should.

Barry: Pruning bad content is advice that SEOs have been giving for a long time to try and help people deal with Panda.

Gary: I do not think that would ever have worked. It definitely does not work with the current version of the core algorithm, and it may just bring your traffic farther down. Panda basically disregards things you do to rank artificially. You should spend resources on improving content instead, but if you don’t have the means to do that, maybe remove it instead.

Using disavow

Michelle: Should you use disavow on the bad links to your site?

Gary: I have a site that gets 100,000 visits every two weeks. I haven’t looked at the links to it for two years, even though I’ve been told that it has some porn site links. I’m fine with that. I don’t use the disavow file. Don’t overuse it. It is a big gun.

Overusing it can destroy your rankings in a matter of hours. Don’t be afraid of sites that you don’t know. There’s no way you can know them all. If they have content, and they are not spammy, why would you disavow them?

Sites like this are very unlikely to hurt you, and they may help you. I personally trust the Google filters.

Barry: Penguin just ignores the links.

Gary: Penguin does that, too (Gary’s phrase implies that there are other algorithms that might filter bad links out, as well).


The trouble with 'Fred'

Disclaimer: All criticism of Google spokespeople contained herein is impersonal in nature. I know they are only representing the internal direction of the company and not acting independently. They do strive to be as helpful as they can.

When former head of web spam Matt Cutts was at Google, he spent a lot of time communicating with webmasters/site owners about updates. We knew what was coming, when it might be coming, and how severe it would possibly be.

If you woke up in the morning and your traffic had fallen off a proverbial cliff, you could go to Twitter and, based on what Cutts was posting, usually determine if Google had run an update. You could even tell how severe the rollout was, as Cutts would typically give you percentage of queries affected.

Penguin 2.1 launching today. Affects ~1% of searches to a noticeable degree. More info on Penguin: http://t.co/4YSh4sfZQj

— Matt Cutts (@mattcutts) October 4, 2013

Although some believe Cutts was more about misinformation than information, when it came to updates, most would agree he was on point.

So if a site fell off that cliff, you could learn from Cutts what happened, what the update was named, and what it affected. This gave you starting points for what to review so that you could fix the site and bring it back into line with Google’s guidelines.

Why the help?

Cutts seemed to understand that Google needs the webmaster. After all, Google’s Search is not their product — the sites they return from that search are the product.

Without someone translating Google’s desires to site owners, those sites would likely not meet those guidelines very well. This would result in a poor experience for Google users. So, that transfer of knowledge between Google, SEOs and site owners was important. Without it, Google would be hard-pressed to find a plethora of sites that meet its needs.

Then, things changed. Matt Cutts left to go to the US Digital Service — and with his departure, that type of communication from Google ended, for the most part.

While Google will still let webmasters know about really big changes, like the mobile-first index, they’ve stopped communicating much detail about smaller updates. And the communication has not been in such an easily consumable format as Cutts tweeting update metrics.

In fact, very little is said today about smaller updates. It has gotten to the point where they stopped naming all but a very few of these changes.

Google communication in 2017

Right now, the Google spokespeople who primarily communicate with SEOs/webmasters are Gary Illyes and John Mueller. This is not a critique of them, as they communicate in the way Google has asked them to communicate.

Indeed, they have been very helpful over the past few years. Mueller holds Webmaster Central Office Hours Hangouts to help answer questions in long form. Illyes answers similar questions in short form on Twitter and attends conferences, where he participates in various AMA (Ask Me Anything) sessions with interviewers.

All this is helpful and appreciated… but unfortunately, it is not the same.

Highly specific information is difficult to find, and questioners are often met with more vagueness than specifics, which can at times feel frustrating. Google has become obtuse in how they communicate with digital marketers, and that seems to be directed by internal company processes and policies.

This lack of algorithmic specificity and update confirmation is how we wound up with Phantom.

Welcome, Phantom

Google has many algorithms, as any SEO knows. Some, like Penguin and Panda, have been rolled into Google’s core algorithm and run in (quasi-) real time, while others, like the interstitial penalty, still run, well, when they run.

Big updates such as Penguin have always been set apart from the day-to-day changes of Google. There are potentially thousands of tweaks to core algorithms that run every year and often multiple times a day.

However, day-to-day changes affect sites much differently than massive algorithm updates like Panda, Penguin, Pigeon, Pirate, Layout, Mobilegeddon, Interstitial, and on and on. One is a quiet rain, the other a typhoon. One is rarely noticed, the other can be highly destructive.

Now, Google is correct in that webmasters don’t need to know about these day-to-day changes unless someone dials an algorithm up or down too much. You might not ever even notice them. However, there are other algorithm updates that cause enough disruption in rankings for webmasters to wonder, “Hey Google, what happened?”

This was true for an algorithm update that became known as Phantom.

Phantom?

There was a mysterious update in 2013 that SEO expert Glenn Gabe named “Phantom.” While it seemed to be focused on quality, it was not related to Panda or Penguin. This was new, and it affected a large number of sites.

When “Phantom” ran, it was not a minor tweak. Sites, and the tools that monitor them, would show large-scale ranking changes that only seem to happen when there is a major algorithm update afoot.

Now, there was one occasion on which Google acknowledged that Phantom existed. Aside from that, however, Google has not named it, acknowledged it, or even denied it when SEOs believed it ran. Over time, this string of unknown quality updates all became known as Phantom.

The word “Phantom” came from the idea that we didn’t know what it was; we just knew that some update that was not Panda caused mass fluctuations and was related to quality.

Not Panda quality updates

The changes introduced by Phantom were not one set of changes like Panda or Penguin, which typically target the same items. However, the changes were not completely disparate and had the following in common:

They were related to site quality.
They were not Panda.
They were all found in the Quality Raters Guide.

We don’t use the word “Phantom” anymore, but from 2013 to 2016, large-scale changes that were quality related and not Panda were commonly called Phantom. (It was easier than “that update no one admits exists, but all indicators tell us is there.”)

You can’t have so many sites shift that dramatically and tell SEOs the update does not exist. We all talk to each other. We know something happened. Not naming it just means we have to “make up” (educated guess) what we think it might be.

And from this mysterious Phantom, Fred was born.

‘Hello, Fred!’

In early March 2017, the SEO world was rocked by a seemingly significant algorithm update that appeared to target link quality. Google, however, would not confirm this update, deflecting questions by responding that Google makes updates to its core algorithm nearly every day.

When Search Engine Land’s Barry Schwartz asked Gary Illyes if he cared to name the unconfirmed update, he responded jokingly:

sure! From now on every update, unless otherwise stated, shall be called Fred

— Gary "鯨理" Illyes (@methode) March 9, 2017

‘Fred’ is more than a funny joke

Of course, Fred is not just a funny thing that happened on Twitter, nor is it just the default name for all Google’s future updates. In fact, it is not actually that funny when you break down what it really means. Fred is representative of something far deeper: Google’s historically unstated “black box.”

Now, Google does not use the term “black box,” but for all intents and purposes, that is exactly what “Fred” represents to webmasters and SEOs.

Meet Google’s black box

A black box is when a system’s inputs and outputs (and their general relationships) are known, but

internal structures are not well understood (or understood at all);
understanding these structures is deemed unnecessary for users; and/or
inner workings are not meant to be known due to a need for confidentiality.

To this end, Google has also communicated to SEOs through different channels that they are acting from a black box perspective — the way they used to before Matt Cutts took over Webmaster communications.

We have been told we don’t need to understand the algorithms. We have been told that this knowledge is not necessary to do the work. We have been told that all we need to do to be successful is be awesome. “Awesomeness” will get us where we need to be.

Google's number one search ranking factor according to @JohnMu is… awesomeness https://t.co/Sja8iHdwIv pic.twitter.com/1RfLJ7ZVH4

— Barry Schwartz (@rustybrick) September 6, 2017

This all sounds good. It really does. Just be awesome. Just follow the Webmaster guidelines. Just read the Google Quality Rater’s Guide. You will be set.

Of course, the devil is in the details.

What does ‘awesome’ mean?

Follow the Webmaster Guidelines. Read the Quality Rater’s Guide. Follow these rules for “awesomeness.”

While that advice can help an SEO become awesome on a basic level, it can’t tell one what to do when there is a complex problem. Have a schema implementation issue? What about trying to figure out how to properly canonicalize pages when doing a site modification or move? Does being awesome tell me how to best populate ever-changing news sitemaps? What about if you get a manual action for that structured data markup because you did something wrong? What about load times?

There are a lot of questions about the million smaller details that fall under “being awesome” that, unfortunately, telling us to “be awesome” does not cover.

This is where the black box becomes potentially detrimental and damaging. Where do you get information about site changes once you have passed the basics of the Webmaster Guidelines and Quality Raters Guide? You saw a change in your site traffic last week; how do you know if it is just your site or an algorithm update if Google won’t tell you?

Being awesome

Google no longer wants SEOs to worry about algorithms. I get it. Google wants you to just be awesome. I get that, too. Google does not want people manipulating their algorithms. Webmaster Guidelines were first written to help stop spam. Google just wants you to make good sites.

The issue is that there still seems to be an unspoken assumption at Google that anyone who wants information about algorithm updates is just trying to find a way to manipulate results.

Of course, some do, but it should be noted most people who ask these questions of Google are just trying to make sure their clients and sites meet the guidelines. After all, there are multiple ways to create an “awesome” website, but some tactics can harm your SEO if done improperly.

Without any confirmations from Google, experienced SEOs can be pretty sure that their methods are fine — but “pretty sure” is not very comforting when you take your role as an SEO seriously.

So, while “being awesome” is a nice idea — and every site should strive to be awesome — it offers little practical help in the ever-changing world of SEO. And it offers no help when a site is having traffic or visibility issues.

So, why is this important?

The lack of transparency is important for several reasons. The first is that Google loses control over the part of its product it has never controlled: the websites it delivers in search results. This is not a concern for site owners, but it seems the ability to actively direct sites toward their goals would be something Google would value and encourage.

They have added Developer Guides to make finding SEO/webmaster information easier, but these only help SEOs. Site owners do not have time to learn how to write a title tag or code structured data. These guides are also very high-level, for the most part — they communicate enough to answer basic questions, but not complex ones.

In the end, Google hurts itself by not communicating in greater detail with the people who help affect how the sites in their search results work.

If it is not communicated to me, I cannot communicate it to the client — and you can be assured they are not going to the Developers site to find out. I can also tell you it is much harder to get buy-in from those at the executive level when your reasoning for proposed changes and new initiatives is “because Google said to be awesome.”

If Google doesn’t tell us what it values, there’s little chance that site owners will make the sites Google wants.

Why else?

SEOs are not spammers. SEOs are marketers. SEOs are trying to help clients do their best and at the same time achieve that best by staying within what they know to be Google’s guidelines.

We work hard to keep up with the ever-changing landscape that is SEO. It is crucial to know whether a site was likely hit by an algorithm update and not, say, an error from that last code push. It takes a lot more time to determine this when Google is silent.

Google used to tell us when they rolled these major algorithm updates out, so it gave you parameters to work within. Now, we have to make our best guess.

I think it would be eye-opening to Google to spend a week or so at different SEOs’ desks and see what we have to go through to diagnose an issue. Without any clear communication from Google that something happened on their end, it leaves literally anything that happens on a website in play. Anything! At least when Google told us about algorithmic fluctuations, we could home in on that.

Without that help, we’re flying blind.

Flying blind

Now, some of us are really experienced in figuring this out. But if you are not a diagnostician — if you do not have years of website development understanding, and if you are not an expert in algorithms and how their changes appear in the tools we use — then you could find yourself barking up a very wrong tree while a crippled site loses money.

Every experienced SEO has had a conversation with a desperate potential client who had no idea they were in violation of Google’s guidelines — and now has no money to get the help that they need because they lost enough search visibility to severely hamper their business.

And that leads me to the last but most important reason that this black box practice can be so damaging.

People

People’s livelihoods depend on our doing our job well. People’s businesses rely on our being able to properly diagnose and fix issues. People’s homes, mortgages and children’s tuition rely on our not messing this up.

We are not spammers. We are often the one bridge between a business making it and employees winding up on unemployment. It may sound hyperbolic, but it’s not. I often joke that 50 percent of my job is preventing site owners from hurting their sites (and themselves) unknowingly. During earlier versions of Penguin, the stories from those site owners who were affected were often heartbreaking.

It was horrible 🙁
I lost my ranking 1st to 66th

— Sarkari Naukri (@SarkariNaukri23) March 10, 2017

Additionally, without input from Google, I have to convince site owners without documentation or confirmation backup that a certain direction is the correct one. Can I do it? Sure. Would I like it if Google did not make my job of convincing others to make sites according to their rules that much harder? Yes.

Will Google change?

Unlikely, but we can hope. Google has lost sight of the very real consequences of not communicating clearly with SEOs. Without this communication, no one wins.

Some site owners will be lucky and can afford the best of the best of us who don’t need the confirmations to figure out what needs to be done. But many site owners? They will not be able to afford the SEO services they need. When they cannot afford to get the audit to confirm to them that yes, Google algorithms hurt your site, they will not survive.

Meanwhile, we as SEOs will have difficulties moving the needle internally when we cannot get buy-in from key players based on the idea of “being awesome.” Google will lose the ability to move those sites toward their aims. If we are not communicating Google’s needs to site owners, they will likely never hear about them. (There is a reason so many sites are still not mobile-ready!)

Is that black box worth it to Google? Perhaps. But is being obtuse and lacking in transparency truly beneficial to anyone in the long run?

It seems there are better ways to handle this than to simply direct everyone to make “awesome” sites and to read the Webmaster Guidelines. We are professionals trying to help Google as much as we are asking them to help us. It is a partnership, not an adversarial relationship.

No one is asking for trade secrets — just confirmation that Google made a change (or not) and generally what they changed.

It is like feeling really sick and going to the doctor, only to be told, “Well, you have a Fred.”

You ask the doctor, “What can I do for a case of ‘Fred?’”

He looks at you and says, “Easy! Just be awesome!” And then he walks out the door.

Well, you think, at least I have WebMD.


In the meantime, here are some ideas of how you can work with Fred and Google’s black box.


August 22, 2017: The day the 'Hawk' Google local algorithm update swooped in

I recently reported on an algorithm update impacting the local results that happened on August 22, 2017. This was a strictly local update, from what I can tell so far, which means that it had no impact on the non-local organic results.

What changed?

The update, which I have dubbed “Hawk,” was a change to the way the local filter works. To get some history here, Google actively filters out listings from the local results that are similar to other listings that rank already. Basically, Google picks the most relevant listing of the bunch and filters the rest. It’s very similar to what they do organically with duplicate content. (Note: Google is typically loath to confirm algorithm updates, usually only saying that it rolls out several updates every day, so these observations are based on an analysis of how local results have changed rather than on any official announcement or acknowledgment.)

The filter has existed for a long time to help ensure that multiple listings for the same company don’t monopolize the search results. In September 2016, the Possum algorithm update made a significant change to the way the filter works. Instead of just filtering out listings that shared the same phone number or website, Google started filtering out listings that were physically located near each other.

This was very problematic for businesses. It meant that if another business in your industry was in the same building as you — or even down the street from you — that could cause you to get filtered out of local search results. Yep, that means your competitors could (inadvertently) bump your listing!

On August 22, 2017, Google refined the proximity filter to make it stricter. It still appears to be filtering out businesses in the same building, but it is not filtering out as many businesses that are close by.
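
Google has never published how this filter actually works, so any code can only be illustrative. As a thought experiment, though, the behavior described above resembles a simple distance-threshold de-duplication, something like the hypothetical sketch below (the thresholds, fields and scoring are all invented for demonstration).

```python
# Purely illustrative: Google has not disclosed how the local filter works.
# The threshold values and listing fields below are invented for demonstration.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_FEET = 20_902_231  # approximate mean Earth radius, in feet

def distance_feet(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in feet."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_FEET * asin(sqrt(a))

def filter_similar_listings(listings, threshold_feet):
    """Keep the strongest listing in each cluster of nearby same-category listings.

    `listings` is a list of dicts with "lat", "lon", "category" and "score" keys.
    A large threshold filters competitors hundreds of feet apart; a small one
    mostly filters listings in the same building.
    """
    kept = []
    for listing in sorted(listings, key=lambda l: l["score"], reverse=True):
        crowded = any(
            kept_listing["category"] == listing["category"]
            and distance_feet(kept_listing["lat"], kept_listing["lon"],
                              listing["lat"], listing["lon"]) < threshold_feet
            for kept_listing in kept
        )
        if not crowded:
            kept.append(listing)
    return kept
```

Shrinking `threshold_feet` from a few hundred feet down to something building-sized is, loosely speaking, the kind of change the Hawk update appears to have made.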

Who this helped

Here is an example of a business I was tracking that benefited from this update. Weber Orthodontics got filtered after the Possum algorithm update for the term “orthodontist wheaton il” due to the fact that they had a competitor down the street — 325 feet from where they were located. This competitor had a higher organic ranking and stronger relevance to that keyword, so they were included in the results, and Weber was filtered out.

Here is a before-and-after screen shot that shows how the local results changed as a result of the Hawk update; notice how Weber was completely missing from the results a few months ago.

I was able to nail down the exact date this happened because I have a robust tracking plan with BrightLocal that scans daily and takes screen shots. After studying multiple, completely unrelated cases, I was able to confirm that all cases had this same pattern on August 22.
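
If you keep daily rank exports of your own, one quick way to spot that kind of shared inflection point is to look for the day with the largest average rank movement across unrelated keywords or clients. Here is a minimal sketch; the CSV file and its columns (date, keyword, rank) are assumed placeholders, not a particular tool’s export format.

```python
# A minimal sketch for spotting the date of a shared ranking shift.
# "daily_ranks.csv" and its columns (date, keyword, rank) are placeholders.
import pandas as pd

ranks = pd.read_csv("daily_ranks.csv", parse_dates=["date"])
pivot = ranks.pivot_table(index="date", columns="keyword", values="rank")

# Average absolute day-over-day rank change across all tracked keywords.
# A spike on one date across many unrelated keywords hints at an algorithmic
# cause rather than a single-site issue.
daily_shift = pivot.diff().abs().mean(axis=1)
print(daily_shift.sort_values(ascending=False).head(5))
```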

Another example was this set of four hotels in Killeen, Texas. Previously, two of the four were filtered.

Now, following the Hawk update, all four are showing.

Who is still filtered?

Naturally, this update didn’t help everyone. Although it tightened the distance needed to filter a similar listing, it didn’t remove it completely. I’m still seeing listings that share an address or building being filtered out of local search results. I also see the filtering problem persisting for a business that is in a different building that’s around 50 feet away from a competitor.

Below, you can see that the local results for “personal injury attorney palmdale” — an example I shared in my Possum article — are still filtering out many attorney listings using the same virtual office address. (All the listings in red share the same address as the listing in green and are filtered as a result.)

Why ‘Hawk?’

The local search community settled on the name “Hawk” for this algorithm update, because hawks eat possums. This is one of the few times where I don’t see any negative outcomes as a result of this update and just wish Google hadn’t taken a year to realize the proximity filter was way too broad.


Meet the fake news of the online marketing world (that Google loves!): Review sites

A recent visit to TopSEOs.com revealed some amazing “facts” about online marketing vendors. Did you know, for example, that:

One of iProspect’s biggest clients is Circuit City (which went bankrupt in 2008)?
iCrossing’s annual revenue is between $1 million and $3 million?
Geary LSF (which closed in early 2016) currently has 93 employees and revenue of more than $10M. Oh, and they are the #59 best SEO agency in the US as of July 2017.
My company, 3Q Digital, also has revenue between $1 million and $3 million (wrong), has three founders (all of whom founded an agency we acquired and only one of whom has ever been part of my agency), is located in an office we haven’t rented for four years and is apparently the #7 best mobile marketing company in Australia (if only we had an Australian client).
An agency in Lehi, Utah, is ranked #1 in the US for search engine optimization, local SEO, remarketing, Facebook advertising, LinkedIn advertising, YouTube advertising, web design, web development, site audits, App Store Optimization, inbound marketing, infographic design, medical SEO, multilingual SEO, online marketing, video marketing, voice search optimization, and web marketing? They also rank highly in Canada and the UK. (I’m not providing a link here because I don’t want to help the organic results.)

So, um, yeah, it seems like a lot of the in-depth research on TopSEOs is at least 3 to 4 years old. I give them some credit, however, for updating the data they had in 2012, when they listed Google as being based in Seattle with annual revenue of $5 million to $10 million.

When top SEO rankings go to fake reviews sites

Now, you might be thinking: Why does this matter? There are plenty of junky pay-to-play review sites that exist entirely to drive leads to their sponsors. Here’s the rub: Google’s SEO algorithm apparently can’t separate the fake review sites from real content. To wit, I did a few Google searches and found TopSEOs highly ranked on numerous terms including:

#1 – Top SEO Agency
#2 – Best SEO Company
#3 – Top PPC Agency
#4 – SEO Agency
#5 – Best SEM Agency
#5 – PPC Agency

And then there’s Clutch.co — which happens to rank its sponsoring service providers in the top spot for most categories — also doing smashingly in the organic results:

#1 – Best SEO Agency
#1 – Best SEO Company (right in front of TopSEOs)
#1 – Top PPC Agency
#1 – SEO Agency
#1 – PPC Agency
#2 – Top SEO Agency (right behind TopSEOs)
#3 – Best SEM Agency

It’s one thing for an agency to tout a bunch of fake rankings on their own website, but when Google is giving top SEO rankings to fake ranking sites, it gives these sites an aura of legitimacy that could lead businesses to trust the rankings.

Indeed, the agency that appears to be paying a ton of money to TopSEOs, and has been awarded the top ranking in the majority of categories, has 1.5 out of 5 stars on Yelp and has received 12 negative reviews (none positive) on the Better Business Bureau’s website. (Snippets from the reviews include “in the end probably created more harm than good,” “they very obviously didn’t care about their work,” and “this is the most dishonest company I’ve ever dealt with.”)

I have a pretty simple theory as to why these review sites continue to impress Google’s algorithms. Every company that “wins” a high ranking promotes the ranking on their website, which means that the fake review sites are getting dozens of links from agencies. So Google’s algorithm sees a bunch of SEM and SEO agencies linking back to a site about “top SEM agencies” and concludes that this must be the most relevant site!

This is the same “link bombing” that resulted in George Bush showing up #1 for the term “miserable failure.” Google has developed algorithms in the past to fix egregious link bombing, so it’s a bit surprising that this problem still exists.

And even if the algorithm isn’t up to snuff, Google allegedly has 10,000 human reviewers who are on the look-out for bad and irrelevant content. It’s hard to believe that none of these folks have ever come across these fake review sites and imposed a ranking penalty.

I do have to give TopSEOs credit for one thing: making me laugh. They have a page about their testing facility that is fantastic, and I recommend reading the whole thing here. To give you a taste of its awesomeness, however, just check out this paragraph about their office, which uses high-falutin’ words to describe what appears to be a thermostat, desk lamps, computers with built-in speakers, nearby restaurants, and a hospital!

Our testing facility is outfitted with temperature control systems, audio capabilities, and automated lighting solutions to provide each team member with the optimal work environment. Our infrastructure makes communication convenient while the location of the testing facility ensures ease of access to products and services which team members may require for sustenance or medical concerns.

All joking aside, I’d like to see Google take their organic results a little more seriously and clean up the “fake news” that is no doubt misleading many businesses.


Google algorithm update may be rolling out since June 25

Google seems to have been rolling out a large search algorithm update starting sometime around Sunday, June 25. Google’s John Mueller said, “We make updates all the time,” when asked specifically about the speculation around a Google update. In my opinion, there was an algorithm update to the Google search rankings over the past few days that is still rolling out.

Google technically did not confirm it, outside of John Mueller’s typical reply to any questions around an algorithm update. But based on the industry chatter that I track closely and the automated tracking tools from MozCast, SERPMetrics, Algoroo, Advanced Web Rankings, Accuranker, RankRanger and SEMrush, among other tools, it seems there was a real Google algorithm update.

Again, Google’s PR team did not officially confirm it outside of John Mueller’s tweet.

Here are some screen shots of those popular tracking tools:

Here are some random tweets from industry SEOs:

Seeing the same site flying up in one country and down in the other.

— Dawn Anderson (@dawnieando) June 26, 2017

Experiencing a huge fluctuation in the rankings. Some keywords received a boost and some went from page 1 to 3

— Deepak Kumar (@deepak_387) June 27, 2017

We suspect the same thing. Clients from the first page got dropped. Do let us know if you get any more info.

— MegReb (@meg_reb) June 27, 2017

Check your analytics and rankings. You may be surprised to see changes in your traffic — hopefully for the better.

Postscript: John Mueller from Google clarified that his reply above did not confirm anything, and Google did not confirm there was a specific update. He said this on June 28th:

There's nothing confirmed from our side, this is all just random chatter. ¯\_(ツ)_/¯

— John ☆.o(≧▽≦)o.☆ (@JohnMu) June 28, 2017

What advanced SEOs need to know about algorithm updates

Times have changed. Gone are the days of yearly algorithm updates that would upend the search results and leave us scrambling. These days, it’s common to see ranking and traffic changes on a daily or weekly basis — and when it comes to algorithms, Google rarely even confirms updates. In fact, according to Olga Andrienko, Head of Global Marketing at SEMrush, of the 28 updates SEMrush tracked this year, only two have been confirmed by Google.

What does this mean for SEOs? Without guidance or transparency from Google, how should we react to ranking changes or possible penalties, and what should we be aware of?

Search experts at SMX Advanced last week tackled these questions in a session titled, “Dealing With Algorithm Updates: What Advanced SEOs Need To Know.” Andrienko and fellow panelists Marie Haynes (Owner, HIS Web Marketing) and Jeff Preston (Senior Manager, SEO, Disney Interactive) provided some tips and checklists to help SEOs better identify penalties, assess traffic drops and take action when needed. Let’s take a look.

It’s not you, it’s me

According to the panelists, just because you saw a big traffic or rankings loss on a day where an algorithm update hit, that doesn’t mean you were penalized. In fact, it’s probably not a penalty at all.

As both Haynes and Preston noted, there are a number of things that can lead to a sudden decrease in traffic: site redesigns, site updates, analytics adjustments, and more. When in doubt, it’s probably you, not Google.

Before assuming you were penalized, identify any changes that were made on your site. Talk to the QA team, check tech team activity, talk to the content team — whoever has the power to make updates to the site should be your first line of communication.

Speakers also noted that it’s important to know your data to be able to make a true assessment. Haynes gave us a checklist of things to look into:

Check Search Console. If you are assessed a manual penalty, you’ll see it in there.
Determine which pages saw traffic drops. If you are just seeing one page being impacted, it’s not an algorithm change.
Check all organic traffic data. If you were impacted by an algorithm update, you should only see an impact in Google (a rough sketch of this comparison follows the list).
Look at your competitors. Did your competitors see any changes? Algorithm updates tend to target certain types of search results and industries. Take a look at competitor rankings.
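
For that organic-traffic comparison, here is a rough sketch of one way to check whether a drop is isolated to Google. It assumes a daily export of organic sessions broken out by search engine; the file name and column names are placeholders rather than any standard analytics API.

```python
# A rough sketch of the "is it only Google?" check. The export format
# (date, source, sessions) and file name are assumptions.
import pandas as pd

traffic = pd.read_csv("organic_sessions_by_engine.csv", parse_dates=["date"])
google = traffic[traffic["source"] == "google"].groupby("date")["sessions"].sum()
others = traffic[traffic["source"] != "google"].groupby("date")["sessions"].sum()

# Week-over-week change for each: a drop isolated to Google is consistent with
# an algorithm update, while a drop across every engine points to a site-side
# issue (redesign, tracking change, technical problem).
comparison = pd.DataFrame({
    "google_wow": google.pct_change(7),
    "other_engines_wow": others.pct_change(7),
}).dropna()
print(comparison.tail(14).round(3))
```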

At the end of the day, you may not have been hit with a penalty at all, and it’s important to look at all of the other factors that might lead to a drop in traffic first.

Tools can guide us

If you work in the SEO space, you know there are tools for just about everything, including SERP volatility. And as Andrienko pointed out, they all have a number of benefits.

But tools can only take us so far. We have to account for the fact that some industries simply fluctuate more than others, or that mobile results tend to fluctuate more than desktop. As Andrienko showed, sports results are almost always different depending on what’s happening that day. The same goes with entertainment and news.

Tools are a good way to track site performance, and while it’s always interesting to see the changes in SERPs, make sure you are looking at the big picture. Look beyond your own site to see how your overall industry is performing and being impacted. And as noted above, just because a tool shows us SERP fluctuations, it doesn’t mean you were penalized.
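For context, here is a minimal sketch of the sort of calculation a volatility tracker might boil down to: an average absolute position change across tracked keywords. This is an illustration only, not any vendor’s actual methodology, and the keyword positions are made up:

```python
# Illustration only -- not any vendor's methodology: a crude volatility
# score as the average absolute position change across tracked keywords.
# The keyword positions below are made up.
from statistics import mean

def daily_volatility(ranks_yesterday, ranks_today):
    """Average absolute position change between two days of rank checks."""
    changes = [abs(t - y) for y, t in zip(ranks_yesterday, ranks_today)]
    return mean(changes) if changes else 0.0

yesterday = [3, 7, 12, 25, 41]   # positions for five tracked keywords
today     = [4, 6, 18, 22, 55]

print(f"Average position change: {daily_volatility(yesterday, today):.1f}")
```

Whatever number you track, compare it against your own vertical’s baseline rather than an industry-wide index — as noted above, sports, news and entertainment queries will look “volatile” on most days.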

Be liked & be valuable

If Google can figure out which sites you like, why can’t they figure out what sites everyone likes? I loved this idea from Haynes and what it implies: Google wants to provide users with the best possible experience. It wants to give users what they are looking for, and the updates we are seeing now are geared to do that.

Case in point: Preston noted Disney removed 80,000 low-quality pages and got a boost in organic traffic. Most sites that remove content don’t see a jump in organic; however, because the majority of these pages were low-quality and receiving ~1 visit per month, they weren’t helping the site in any way.

Haynes also focused on E-A-T (Expertise, Authoritativeness, Trustworthiness) and noted that if you don’t pursue these qualities, you are going to be outranked by competitors who do.

She also discussed the idea that certain people and brands have E-A-T for certain topics, and that Google is working to connect the two.

Wrap-up

The speakers did a fantastic job covering what to look for, how to find information and what to do if you are penalized. But one of the key pieces of advice throughout it all was to talk to your industry friends. If you are seeing changes or weird occurrences, ask others in the search space, and they’ll likely be able to point you in the right direction.

And at the end of the day, just make your site better. If you think you are doing something that might drive a penalty, stop and fix it. If you think your site can be improved, work on that. The search engines want to serve up quality, and it’s your job to give them that.

Couldn’t make SMX? Be sure to check out the presentations from each of the speakers below:

Algorithm changes and My Sites By Jeff Preston from Search Marketing Expo – SMX

Traffic Drop Assessments By Marie Haynes from Search Marketing Expo – SMX

Making Change Predictable: Tools That Track Google Volatility By Olga Andrienko from Search Marketing Expo – SMX


How to stop worrying about Google updates

As SEOs, we tend to obsess over changes to the organic results. It usually works like this:

You get to your computer in the morning. Ready to start work, you take a quick look at Facebook to check what you have missed. You run across someone asking if anyone saw changes last night. They’ll typically also note that there was “a lot of activity.”

“Activity” means that SEOs who follow changes to search rankings saw some fluctuations in a short period of time. If there is “a lot of activity,” that means there were large fluctuations in many websites’ rankings in a vertical or across verticals. Sometimes these results are positive, but mostly they are not. Big updates can often mean big drops in traffic.

So you quickly go check your Analytics and Search Console. Phew! The “activity” didn’t impact you — this time. But what about the next one?

This is what happens when Google rolls out large-scale changes to its search algorithms, and what is in these rollouts has been the topic of many articles, tweets and Facebook posts over the years.

What if I told you, though, that while it is very important to know what Google’s algorithms contain, you do not really need to know granular details about every update to keep your site in the black?

No-name rollouts

When former Head of Web Spam Matt Cutts was the point of communication between SEOs and Google, he would confirm updates — and either he or others in the industry would give each update a name. This was very helpful when you had to identify why your site went belly up. Knowing what the update was targeting, and why, made it much easier to diagnose the issues. However, Google does not share that information much anymore. They are much more tight-lipped about what changes have been rolled out and why.

Sure, Google will confirm the big stuff — like the last Penguin update, when it went real-time — but how many times have we seen an official announcement of a Panda update since it became part of the core ranking algorithm? The answer is none — and that was over 18 months ago.

The ‘Fred’ factor

As for all the other unidentified changes SEOs notice, but that Google will not confirm? Those have just been given the name “Fred.”

Fred, for those who don’t know, is just a silly name that came out of an exchange between Google Webmaster Trends Analyst Gary Illyes and several SEOs on Twitter. Fred is meant to cover every “update” SEOs notice that Google does not confirm and/or name.

So, what’s an SEO or site owner to do? If your site suffers a downturn, how will you know what caused it? How do you know what to fix if Google won’t tell you what that update did? How can you make gains if you don’t know what Google wants from you? And even more importantly, how do you know how to protect your site if Google does not tell you what it is “penalizing” with its updates?

Working without a net

Today, we work in a post-update world. Google updates are rolling out all the time. According to Gary Illyes and John Mueller, these algorithms update almost every day — usually several times a day — but Google doesn’t share that information with the community.

If they update all the time, how is it a post-update world?

Post-update world refers to a world where there is no official identifying/naming of algorithm changes, no confirmation that an update has been rolled out, and consequently, no information on when that rollout occurred. Basically, the updates they tell us about are becoming more and more infrequent. Where Matt Cutts might have told us, “Hey we are pushing Penguin today”…

… Illyes or Mueller might just say:

So, if you cannot get the information about updates and algorithm changes from Google, where do you go?

Technically, you can still go to Google to get most of that information — just more indirectly.

Falling off an analytics cliff

While Google is not telling you much about what they are doing these days with regard to algorithm updates, you still can wake up and find yourself at the bottom of an analytics cliff. When this happens, what do you do? Running to Twitter might get you some answers, but mostly you will just get confirmation that some unknown algorithm (“Fred”) likely ran.

Outside of reading others’ thoughts on the update, what can we use to determine exactly how Google is defining a quality site experience?

Understanding the Google algorithms

A few years back, Google divided up most algorithm changes between on-page and off-page. There were content and over-optimization algorithms, and there were link algorithms. The real focus of all of these, however, was spam. Google became the search market leader in part by being better than its competitors at removing irrelevant and “spammy” content from its search results pages.

But today, these algorithms cover so much more. While Google still uses algorithms to remove or demote spam, they are additionally focused on surfacing better user experiences. As far back as 2012, Matt Cutts suggested that we change SEO from “Search Engine Optimization” to “Search Experience Optimization.” About 18 months later, Google released the Page Layout Update. This update was the first to use a rendered page to assess page layout issues, and it brought algorithmic penalties with it.

What do algorithm updates ‘cover’?

Most algorithm updates address issues that fall under the following categories (note mobile and desktop are grouped here):

- Link issues
- Technical problems
- Content quality
- User experience

But how do we know what rules our site violated when Google does not even confirm something happened? What good are categories if I don’t know what the rules are for those categories?

Let’s take a look at how we can evaluate these areas without Google telling us much about what changes occurred.

Link issues: It’s all about Penguin

One of the most heavily scrutinized areas of organic SEO is, of course, links — and Penguin is the algorithm that evaluates them.

It could be said that Penguin was one of the harshest and most brutal algorithm updates Google had ever released. With Penguin, if a site had a very spammy link profile, Google wouldn’t just devalue their links — they would devalue their site. So it often happened that a webmaster whose site had a spammy inbound link profile would find their whole site removed from the index (or dropped so far in rankings that it may as well have been removed). Many site owners had no idea until they walked in one day to a 70+ percent drop in traffic.

The site owner then had to make fixes, remove links, do disavows and wait. And wait. And wait until Penguin updated again. The last time it refreshed, there had been a two-year gap between algorithm updates. Without the update, your site could not fully (or sometimes even partially) recover its ranking losses.

September 2016: Real-time Penguin

In September 2016, everything changed: Google made Penguin part of its core algorithm. Penguin’s data now refreshes in real time, and it no longer impacts an entire website’s rankings by default. Thus, with this update, Penguin was no longer a site killer.

When Penguin runs now, it will only devalue the links, not the site — meaning that rankings might be adjusted on query, page or section level. It would be rare to come in and check your site in the morning to find it has fallen off an analytics cliff entirely. That could happen, but if your site links are that spammy, it is much more likely you would get a manual penalty.

When real-time is not real-time

Now, “real-time Penguin” does not mean literally real-time. Google still needs to recrawl your site once the link issues have been fixed, which could take weeks, depending on how often Google crawls your site. Still, this real-time update makes it much easier to fix your link profile if you determine that links are your issue (spammy links are typically very obvious).

Remember, all sites will likely have some bad links. After all, it is not natural for a site to have a perfect backlink profile. But when bad links make up a significant percentage of your inbound links (say, around 25-30 percent), you need to start looking with a critical eye toward fixing spammy links and/or anchor text. (A general rule of thumb: if more than 50 percent of your links or anchor text is spammy, you most likely have a link devaluation.)

So, identifying site issues related to links is fairly straightforward. Are your links good links? Do you have over-optimized anchor text? If you have a spammy link profile, you just need to fix the link issues — get the links removed where you can, disavow the links where you can’t, and work on replacing these spammy links with good ones. Once you’ve fixed the link issues, you just have to wait.
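As a rough sketch of that cleanup, the snippet below tallies the share of flagged links in a backlink export and writes the leftovers into a disavow file. The CSV columns (source_domain, spam_flag) and the file names are assumptions about your backlink tool’s export, not a standard format; the output itself follows the format Google’s disavow tool accepts (comment lines starting with #, full URLs, or domain: entries):

```python
# Minimal sketch: estimate the spammy share of a backlink export and build
# a disavow file for the links you could not get removed. The CSV columns
# (source_domain, spam_flag) and file names are assumptions about your
# backlink tool's export, not a standard format.
import csv

def load_backlinks(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def spam_share(backlinks):
    flagged = [b for b in backlinks if b["spam_flag"].lower() == "true"]
    share = (len(flagged) / len(backlinks)) if backlinks else 0.0
    return flagged, share

def write_disavow(flagged, path="disavow.txt"):
    # Disavow file format accepted by Google's tool: '#' comment lines,
    # full URLs, or 'domain:example.com' lines to disavow a whole domain.
    domains = sorted({b["source_domain"] for b in flagged})
    with open(path, "w", encoding="utf-8") as f:
        f.write("# Spammy links we could not get removed manually\n")
        for domain in domains:
            f.write(f"domain:{domain}\n")

backlinks = load_backlinks("backlinks_export.csv")
flagged, share = spam_share(backlinks)
print(f"{share:.0%} of inbound links are flagged as spammy")

if share > 0.25:  # the rough 25-30 percent threshold mentioned above
    write_disavow(flagged)
    print("Wrote disavow.txt -- review it, then upload via Search Console.")
```

Domain-level disavows are the blunter instrument; if only individual pages on an otherwise decent site link to you unnaturally, list the full URLs instead.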

As mentioned above, it can take up to a few weeks to see a recovery. In the meantime, you need to review the other areas of your site to see if they are in line with what Google defines as a quality site. “But I know the problem is links!” you say. Well, you might be right — but a site can receive multiple devaluations. You always want to check “all the things!”

Technical, content and user experience issues

This is where we have so much less guesswork than when we are looking at a link issue. Why? Because Google has provided webmasters with a wealth of information about what they think makes a good site. Study what is in their documentation, come as close to the Google site ideal as possible, and you can be pretty sure you are in good standing with Google.

Where do you find this information?

Following are some resources you can use to get a solid idea of what Google is looking for when it comes to a website. These resources cover everything from SEO best practices to guidelines for judging the quality of site content:

- Search Engine Optimization Starter Guide — This is a basic outline of best practices for helping Google to crawl, index and understand your website content. Even if you are an experienced SEO, it never hurts to review the basics.
- Google Webmaster Guidelines — These are Google’s “rules of the road” for site owners and webmasters. Stay on the right side of the Webmaster Guidelines to avoid incurring a manual action.
- Google Quality Raters Guide — This is the guide Google gives its quality raters to help them evaluate the quality of search results — in other words, what a user finds after clicking on your website listing from a search results page. Quality raters use this guide to determine what is and what is not a good page/site, and you can garner a lot of helpful insights from this content.
- Bonus: Search Engine Land’s Periodic Table of SEO Factors — This isn’t a Google resource, but it’s helpful nonetheless.

Almost anything and everything you need to know about creating a good site for Google is in these documents.

One note of caution, however: The resources above are only meant as guides and are not the be-all and end-all of SEO. For instance, if you only acquired links in the ways Google recommends, you would never have any. However, these documents will give you a good blueprint for making your site as compliant with the algorithms as possible.

If you use these as guides to help make site improvements, you can be fairly certain you will do well in Google. And furthermore, you will be fairly well protected from most negative algorithm shifts — even without knowing what algorithm is doing what today.

The secret? It is all about distance from perfect, a term coined by Ian Lurie of Portent. In an SEO context, the idea is that although we can never know exactly how Google’s algorithms work, we still know quite a lot about what Google considers to be a “perfect” site or web page — and by focusing on these elements, we can in turn improve our site performance.
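As a toy illustration of the idea — and emphatically not Ian Lurie’s actual methodology — you could think of “distance from perfect” as the share of a guidelines-based checklist your page still fails. The checks and weights below are invented for the example:

```python
# Toy illustration of "distance from perfect": score a page against a
# checklist drawn loosely from Google's public guidelines. The checks and
# weights are invented for this example, not an official scoring model.
CHECKS = {
    "content answers the query in depth": 3,
    "page renders well on mobile": 2,
    "no intrusive ads above the fold": 2,
    "descriptive title and meta description": 1,
    "internal links use sensible anchor text": 1,
}

def distance_from_perfect(passed_checks):
    """Share of the weighted checklist the page is still missing."""
    total = sum(CHECKS.values())
    earned = sum(w for check, w in CHECKS.items() if check in passed_checks)
    return 1 - earned / total

# Hypothetical audit result for one page:
page_passes = {
    "content answers the query in depth",
    "descriptive title and meta description",
}
print(f"Distance from perfect: {distance_from_perfect(page_passes):.0%}")
```

The point isn’t the score itself; it’s that every unchecked item is a concrete, Google-documented thing to fix, regardless of which update is rolling out this week.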

So, when your site has suffered a negative downturn, consult the available resources and ask yourself, What line(s) did I cross? What line(s) did I not come close to?

If you can move your site toward the Google ideal, you can stop worrying about every algorithm update. The next time you wake up and everyone is posting about their losses, you can be fairly confident that you’ll check your metrics, see that nothing bad happened and move on with your day.

The resources listed above tell you what Google wants in the site. Read them. Study them. Know them.

Quality Rater’s Guide caveat

It is important to note that the Quality Rater’s Guide is (as it says) for Quality Raters, not search marketers. While it contains a great deal of information about how you can create a quality site, remember it is not a guide to SEO.

That being said, if you adhere to the quality guidelines contained therein, you are more likely to be shortening that distance to perfect. By understanding what Google considers to be a high- (or low-) quality page, you can create content that is sure to satisfy users and search engines alike — and avoid creating content that might lead to an algorithmic penalty.

Get busy reading!

It’s important to educate yourself on what Google is looking for in a website. And it’s a good idea to read up on the major algorithm updates throughout the search engine’s history to get an idea of what issues Google has tackled in the past, as this can provide some insight into where they might be headed next.

However, you don’t need to know what every “Fred” update did or didn’t do. The algorithms are going to target links and/or site quality. They want to eliminate spam and poor usability from their results. So make sure your site keeps its links in check and does not violate the rules listed in the documents above, and you will likely be okay.

Read them. Know them. Apply them. Review them often. Repeat for future-proofing and site success.


Stop oversimplifying everything!

Once upon a time, our world was simple. There was a thesis — “The Anatomy of a Large-Scale Hypertextual Web Search Engine” by Sergey Brin and Larry Page — that told us how Google worked. And while Google evolved rapidly from the concepts in that document, it still told us what we needed to know to rank highly in search.

As a community, we abused it — and many made large sums of money simply by buying links to their site. How could you expect any other result? Offer people a way to spend $2 and make $10, and guess what? Lots of people are going to sign up for that program.

But our friends at Google knew that providing the best search results would increase their market share and revenue, so they made changes continually to improve search quality and protect against attacks by spammers. A big part of what made this effort successful was obscuring the details of their ranking algorithm.

When reading the PageRank thesis was all you needed to do to learn how to formulate your SEO strategy, the world was simple. But Google has since been issued hundreds of patents, most of which have probably not been implemented and never will be. There may even be trade secret concepts for ranking factors for which patent applications have never been filed.

Yet, as search marketers, we still want to make things very simple. Let’s optimize our site for this one characteristic and we’ll get rich! In today’s world, this is no longer realistic. There is so much money to be had in search that any single factor has been thoroughly tested by many people. If there were one single factor that could be exploited for guaranteed SEO success, you would already have seen someone go public with it.

‘Lots of different signals’ contribute to rankings

Despite the fact that there is no silver bullet for obtaining high rankings, SEO professionals often look for quick fixes and easy solutions when a site’s rankings take a hit. In a recent Webmaster Central Office Hours Hangout, a participant asked Google Webmaster Trends Analyst John Mueller about improving his site content to reverse a drop in traffic that he believed to be the result of the Panda update from May 2014.

The webmaster told Mueller that he and his team were going through the site category by category to improve the content; he wanted to know whether rankings would improve category by category as well, or whether a blanket score is applied to the whole site.

Here is what Mueller said in response:

“For the most part, we’ve moved more and more towards understanding sections of the site better and understanding what the quality of those sections is. So if you’re … going through your site step by step, then I would expect to see … a gradual change in the way that we view your site. But, I also assume that if … you’ve had a low quality site since 2014, that’s a long time to … maintain a low quality site, and that’s something where I suspect there are lots of different signals that are … telling us that this is probably not such a great site.”

(Note: Hat tip to Glenn Gabe for surfacing this.)

I want to draw your attention to the phrase “lots of different signals” in the above comment. Doesn’t it make you wonder what those signals are?

While it’s important not to over-analyze every statement by Googlers, this certainly does sound like the related signals would involve some form of cumulative user engagement metrics. However, if it were as simple as improving user engagement, it likely would not take a long time for someone impacted by a Panda penalty to recover — as soon as users started reacting to the site better, the issue would presumably fix itself quickly.

What about CTR?

Larry Kim is passionate about the possibility that Google directly uses CTR as an SEO ranking factor. By the way, do read that article. It’s a great read, as it gives you tons of tips on how to improve your CTR — which is very clearly a good thing regardless of SEO ranking impact.

That said, I don’t think Google’s algorithm is as simple as measuring CTR on a search result and moving higher-CTR items up in the SERPs. For one thing, it would be far too easy a signal to game — industries well-known for aggressive SEO testing would have pegged it as a ranking factor and made millions of dollars on it by now. Second, a high CTR says nothing about the quality of the page you land on; it speaks to your approach to title and meta description writing, and to branding.

We also have the statements by Paul Haahr, a ranking engineer at Google, on how Google works. He gave the linked presentation at SMX West in March 2015. In it, he discusses how Google uses a variety of user engagement metrics; the upshot is that they are NOT used as a direct ranking factor but instead in periodic quality control checks of the other ranking factors Google does use.

Here is a summary of what his statements imply:

- CTR, and signals like it, are NOT a direct ranking factor.
- Signals like content quality and links, and algorithms like Panda, Penguin and probably hundreds of others, are what they use instead (the “Core Signal Set”).
- Google runs a number of quality control tests on search quality. These include CTR and other direct measurements of user engagement.
- Based on the results of these tests, Google will adjust the Core Signal Set to improve test results.

The reason for this process is that it allows Google to run its quality control tests in a controlled environment that is not easily subject to gaming, which makes it far harder for black-hat SEOs to manipulate the algorithm.
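To make the distinction clearer, here is a toy illustration — purely speculative, and in no way a description of Google’s real systems — of what “engagement metrics as quality control rather than ranking input” could look like: candidate weightings for the core signals are compared offline against an engagement-style test, and only the chosen weights ever touch live ranking:

```python
# Toy illustration -- purely speculative, not Google's actual system:
# engagement metrics such as CTR act as an offline quality-control check
# that picks between candidate weightings for the "core" signals, rather
# than being fed into the live ranking score itself.
import random

random.seed(0)

def rank_score(doc, weights):
    # The live ranking only ever sees the core signals (links, quality).
    return weights["links"] * doc["links"] + weights["quality"] * doc["quality"]

def offline_engagement_test(weights, docs):
    # Stand-in for a quality-control experiment: pretend measured CTR is
    # higher when genuinely higher-quality documents are ranked first.
    ranked = sorted(docs, key=lambda d: rank_score(d, weights), reverse=True)
    return sum(d["quality"] for d in ranked[:3]) / 3

docs = [{"links": random.random(), "quality": random.random()} for _ in range(20)]

candidate_weights = [
    {"links": 0.8, "quality": 0.2},
    {"links": 0.5, "quality": 0.5},
    {"links": 0.2, "quality": 0.8},
]

# The quality-control loop: test candidates offline, ship the best one.
best = max(candidate_weights, key=lambda w: offline_engagement_test(w, docs))
print("Core signal weights chosen by the quality-control loop:", best)
```

In a setup like this, inflating CTR on your own listings wouldn’t move your rankings directly; at most it would nudge an offline experiment that Google controls end to end.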

So is Larry Kim right? Or Paul Haahr? I don’t know.

Back to John Mueller’s comments for a moment

Looking back on the John Mueller statement I shared above, it strongly implies that there is some cumulative impact over time of generating “lots of different signals that are telling us that this is probably not such a great site.”

In other words, I’m guessing that if your site generates a lot of negative signals for a long time, it’s harder to recover, as you need to generate new positive signals for a sustained period of time to make up for the history that you’ve accumulated. Mueller also makes it seem like a gradated scale of some sort, where turning a site around will be “a long-term project where you’ll probably see gradual changes over time.”

However, let’s consider for a moment that the signal we are talking about might be links. Shortly after the aforementioned Office Hours Hangout, on May 11, John Mueller also tweeted out that you can get an unnatural link from a good site and a natural link from a spammy site. Of course, when you think about it, this makes complete sense.

How does this relate to the Office Hours Hangout discussion? I don’t know that it does (well, directly, that is). However, it’s entirely possible that the signals John Mueller speaks about in Office Hours are links on the web. In which case, going through and disavowing your unnatural links would likely dramatically speed up the process of recovery. But is that the case? Then why wouldn’t he have just said that? I don’t know.

But we have this seemingly genuine comment from Mueller on what to expect in terms of recovery, with no easily determined explanation of what signals could be driving it.

We all try to oversimplify how the Google algorithm works

As an industry, we grew up in a world where we could go read one paper, the original PageRank thesis by Sergey Brin and Larry Page, and kind of get the Google algorithm. While the initial launch of Google had already deviated significantly from this paper, we knew that links were a big thing.

This made it easy for us to be successful in Google, so much so that you could take a really crappy site and get it to rank high with little effort. Just get tons of links (in the early days, you could simply buy them), and you were all set. But in today’s world, while links still matter a great deal, there are many other factors in play. Google has a vested interest in keeping the algorithms they use vague and unclear, as this is a primary way to fight against spam.

As an industry, we need to change how we think about Google. Yet we seem to remain desperate to make the algorithms simple. “Oh, it’s this one factor that really drives things,” we want to say, but that world is gone forever. This is not a PageRank situation, where we’ll be given a single patent or paper that lays it all out, know that it’s the fundamental basis of Google’s algorithm, and then know quite simply what to do.

The second-largest market cap company on planet Earth has spent nearly two decades improving its ranking algorithm to ensure high-quality search results — and maintaining the algorithm’s integrity requires, in part, that it be too complex for spammers to easily game. That means that there aren’t going to be one or two dominating ranking factors anymore.

This is why I keep encouraging marketers to understand Google’s objectives — and to learn to thrive in an environment where the search giant keeps getting closer and closer to meeting those objectives.

We’re also approaching a highly volatile market situation, with the rise of voice search, new devices like the Amazon Echo and Google Home coming to market, and the impending rise of personal assistants. This is a disruptive market event: Google’s position as the number one player in search may be secure, but search as we know it may no longer be that important an activity. People are going to shift to using voice commands and a centralized personal assistant, and traditional search will be a minor feature in that world.

What this means is that Google needs its results to be as high-quality as it can possibly make them while fighting off spammers at the same time. The result? A dynamic, ever-changing algorithm that keeps improving overall search quality — both to maintain Google’s stranglehold on market share and, if at all possible, to establish a lead in the world of voice search and personal assistants.

What does it mean for us?

The simple days of gaming the algorithm are gone. Instead, we have to work on a few core agenda items:

- Make our content and site experience as outstanding as we possibly can.
- Get ready for the world of voice search and personal assistants.
- Plug into new technologies and channel opportunities as they become available.
- Promote our products and services in a highly effective manner.

In short, make sure that your products and services are in high demand. The best defense in a rapidly changing marketplace is to make sure that consumers want to buy from you. That way, if some future platform does not provide access to you, your prospective customers will let them know.

Notice, though, how this recipe has nothing to do with the algorithms of Google (or any other platform provider). Our world just isn’t that simple anymore.
