From algo to aggro: How SEOs really feel about Google algorithm updates

As SEOs working in the weeds with our clients each day, it can sometimes be hard to truly see how major Google algorithm updates affect our industry as a whole. Sure, we can perform test after test to see how our clients are affected, but what about the poor account manager or technical SEO director who has to put in the extra work and placate potentially panicked and frustrated clients? How are they personally affected?

BrightLocal (my employer) anonymously polled 650 SEO professionals recently on this very subject, asking them a host of questions about how algorithm updates impact their workload, their client relationships and their job satisfaction. Below, I’ll go over some of the startling results from our survey, “The Human Impact of Google Algorithm Updates.”

Google update? What Google update?

First, and perhaps most alarmingly, 36 percent of respondents couldn’t say whether their business or their clients’ businesses have ever been impacted by a Google algorithm update. This should come as a shock — although this isn’t necessarily Day 1 SEO Stuff, it’s certainly Week 1 SEO Stuff.

The high percentage shown here suggests that either Google needs to better communicate the potential effects of an algorithm change (we can dream, right?) and/or SEOs and in-house marketers need to do more to stay on top of updates and investigate whether their clients have been affected by them.

‘And how does that make you feel?’

Of the significant 44 percent who said their business or their clients’ had been affected by algorithm changes, 26 percent say they struggle to know how to react, and 25 percent get stressed when updates happen. (Note: For this question, respondents were able to select multiple answers.) However, on the flip side, an encouraging 58 percent either don’t get worried about updates or are actually excited by the challenge.

It’s perfectly natural for different types of people at different levels of experience to have differing reactions to potentially stressful situations, but 26 percent of respondents say they struggle to know how to react. This means that all the content you put out immediately after a Google update — whether to cash in on suddenly popular “what just happened to the Google algorithm” keywords or to genuinely help SEOs serve their clients better (we’re hoping it’s the latter) — isn’t reaching everyone.

At this point in the Google updates timeline, we should all, as content creators and content readers, be better versed in learning how to react after a Google update.

The penultimate straw

For many, it seems, the camel’s back can very nearly be broken by a surprise Google update. Just over a quarter of respondents said they’d considered leaving the SEO industry because of algorithm updates but ultimately decided to stick around.

It’s worth taking a step back next time an update hits. Take a look around your agency — are your SEO staff or colleagues ready to break? It takes strong leadership and a solid bedrock of skills for an SEO agency to bounce back from a big update, so make sure your best SEOs are made of the right stuff to prepare them for the worst — and, as we’ll see now, it gets bad.

How to lose clients and alienate Google

Nearly a third of respondents who said that Google updates had had an effect on business actually lost clients as a result.

But it’s not all bad news. Twenty-six percent won clients, 23 percent saw the opportunity to grow their work with existing clients, and 29 percent of respondents noticed no change after the update. So there’s quite a lot of positivity to be found here, especially considering respondents were able to choose multiple answers (which could mean that respondents both won and lost clients because of Google updates).

What this ultimately means is that what happens after a Google update is up to you. You can’t point at the above chart and say, “Well, everyone loses clients after a Google update,” because they don’t. The range of responses shows just how much is at stake when an update hits, but it also shows the huge opportunities available to those agencies that communicate with their existing clients quickly and knowledgeably, carefully managing expectations along the way, while also keeping their eye out for businesses who have taken a beating in rankings/traffic and are looking for help.

The client-agency relationship

One final point the survey touched on was the client-agency relationship and how it can be affected by Google updates. A majority agreed that updates make clients more dependent on agencies. (Who knew it? It turns out that every time Google released an algorithm update, they were doing SEOs a favor all along!)

However, with that extra dependency comes extra scrutiny, as seen by the 31 percent of respondents who feel that Google updates lead to clients distrusting agencies. The wisest SEOs in this particular situation are the ones going into client update meetings with clear, transparent overviews of what the client’s money or their time is being spent on, and simplified (but not necessarily simple) explanations of the ramifications of the Google update.

And for the 28 percent who said that Google updates make clients consider changing agency? Well, I hope you do better next time!

What is the first thing you do when an algorithm update happens?

Before I leave you to stew on all that data and start pre-packing your next Google Update Emergency Go-Bag, here are some of the qualitative responses we received to one particular question in the survey, “What is the first thing you do when an algorithm update happens?” May these serve to remind you that whatever happens, no SEO is alone:

The data-divers

“Run ranking reports on all clients.”
“Review all the sites that are affected and determine what they have in common. That gives me a starting point as to what has changed.”
“Determine which high-volume pages are most impacted, then review existing SEO to try to uncover anything that might be the cause of the traffic from an on-page or technical SEO perspective.”

The researchers

“Read the posts on it to find out what happened and how to react.”
“Figure out how I need to change my strategy.”
“The first thing I do is research to find out what has been impacted. Next, I inform my team of what to expect from incoming client calls. Following that, I write an article for our blog to include our clients in on the updates.”
“Read, read, read everything I can get my hands on.”
“Read and study. Then work to fix it.”
“Check forums/respected sites to find out as much information as possible.”
“Get educated.”
“Read as much as I can on what happened/what was affected, then find what it did to my websites/keyword rankings, then rebuild and re-conquer.”
“Start reading news releases and blogs from highly respected SEO professionals to try to figure out the changes.”

The vice users

“Grab an adult beverage (or two).”
“Drink coffee.”
“Smoke a cigarette.”
“Go for a few beers.”
“Take a Xanax.”

The waiters

“Wait a few weeks while watching the SERPs.”
“Nothing, I wait for the algorithm to normalize. I take a look at websites that drop, and websites that increase in rankings. I then compare and contrast my clients’ sites to those. Once I have better understanding of how the algorithm affects sites, I will adjust the strategy.”
“Just ignore it for a couple weeks then make adjustments.”

The communicators

“Check for confirmation of update. Assess impact. Communicate with affected clients.”
“Share the news with my team and engage them in coming up with a plan.”

The extremes

“Prepare for the s***-storm ahead.”
“Freak out.”
“Cry.”

The one person who was actually positive about it

“Celebrate the new consulting opportunities that will result.”


6 ways ad agencies can thrive in an AI-first world

Artificial intelligence (AI) and machine learning have long been part of PPC — so why are AI and machine learning all of a sudden such hot topics? It is, in part, because exponential advances have now brought technology to the point where it can legitimately compete with the performance and precision of human account managers.

I recently covered the new roles humans should play in PPC as automation takes over. In this post, I’ll offer some ideas for what online marketing agencies should consider doing to remain successful in a world of AI-driven PPC management.

Be a master of process

According to the authors of the book “The Second Machine Age,” chess master Garry Kasparov offered an interesting insight into how humans and computers should work together after he became the first chess champion to be defeated by a computer in 1997. In matches after his loss to Deep Blue, he noticed a few things:

A human player aided by a machine could beat a computer.
When two human players were both assisted by a computer, the weaker human player with a good process could beat the stronger player with an inferior process.

The first point is covered in my previous post, and it is the foundation for why smart PPC managers will learn to collaborate with AI rather than compete against it.

The second point got me thinking about some other scenarios where the winners aren’t necessarily the most skilled. Does the world’s most successful coffee chain have the best baristas? Do the most successful hotels employ staff who innately know how to make guests happy?

No. In almost any scenario where humans are a big part of the experience, success is achieved by having a clear mission that is supported by a really strong process and tools to achieve the mission.

Hence, I believe that in the world of PPC agencies, a primary focus should be on building an amazing process and equipping the team with tools that make that process easy to follow. So as AI takes over some of the tasks in your agency, make sure your staff knows and follows the process for leveraging the technology to deliver results.

Accept that your old value proposition is toast

Consider how you convinced your existing clients to sign up with your agency. If your pitch included that you produce amazing results because you’re really good at bid management (something machines are getting really good at), you may need to tweak your positioning. You don’t want to make your main value proposition something that can be put on autopilot by anyone — and will hence become very difficult to price at a level that makes you successful.

That’s not to say that you should stop thinking about something like bid management altogether. Instead, you should offer skills that are complementary to the AI system rather than skills that compete against it.

Hal Varian, Google’s chief economist, gives the career advice to “become an indispensable complement to something that’s getting cheap and plentiful.” For example, become a data scientist because we’ll need more people to make sense of the data and to figure out how to turn new insights we get from more sophisticated AI into new strategies.

In the context of an ad agency, this makes a lot of sense. You want to be able to say you have great data scientists who can make sense of what the automated systems are doing and make solid recommendations for the next thing to test.

Determine your new value proposition

Do you know California’s largest agricultural export? I guessed wine, but the correct answer is almonds. How did this come to be? It turns out that almonds are easy to harvest mechanically; you basically have a machine that violently shakes the tree so the nuts fall down to be harvested. So farmers figured they could be more productive by using automation, and all of a sudden tomato fields across the state were turned into almond orchards.

But people want more than just almonds on their plates, so despite how automation moved an entire state’s economy in a certain direction, it also created opportunities for farmers who didn’t automate.

We can apply this analogy to paid search agencies. Thanks to advances in AI, it is a given that they will do a good job of managing bids, and it’s also assumed that this service will be cheap because technology has commoditized it.

Agencies, like farmers, can supplement their highly automatable service offerings with something that commands a higher fee. So figure out your niche in things that are harder to automate, and think about why a client would want to hire you if you’re just as good as the next agency at managing bids. What additional services are you really good at that are harder to automate (for now) and can be used to win new business?

Be the best at testing because testing leads to innovation

Innovative agencies win awards, which makes it easier for them to land new clients and grow their business. But how can an agency be innovative in a world where a lot of the work is done by a handful of automated systems that produce similar results?

I believe economist Martin Weitzman’s recombinant view of innovation offers a possibility. Recombinant innovation describes a process through which new ideas emerge as combinations of existing ideas. Thanks to better prediction systems using machine learning, it is now possible for agencies to test new ideas faster and to iterate faster. Hence, an agency that leverages machine learning for testing and has a really strong process will be able to out-innovate its competitors.

Innovation in an agency means recombining ideas into valuable new ones. The problem with testing new ideas is that it used to take a lot of time. But thanks to technology, you can test more things more quickly, and the winning agencies will be those that are the fastest at finding new winners. And they can achieve this by prioritizing the most likely winners into the fastest process, with the best testing technology.

You need to monitor the tradeoffs between labor and technology

Business is a big optimization problem. As an agency owner, you balance labor (headcount) and capital investment (technology) to achieve outcomes with a target level of speed, quality and cost. As technology takes hold in more aspects of PPC management, knowing how to optimize the equation becomes critical.

What some advertisers fail to see is that there is no perfect technology (just as there is no perfect human employee), but if a technology gets you close enough to the desired result while freeing up your staff’s time to work on other things, that is a win.

We all hire people for our companies, even when we know that ALL humans make mistakes. But we hire the best we can because it gets us closer to our goals, even if not 100 percent of the way. So why should it be any different when we think about capital investments?

A former colleague of mine who is still at Google shared examples where advertisers told him that they would not use broad match because it resulted in some impressions for their ads on irrelevant queries. But when prodded further, they were unable to quantify the impact this had. In many cases, the additional clicks were negligible, while the time they could have saved by letting Google’s AI handle query exploration was significant.

In my view, this is a poor optimization of that account manager’s time. In exchange for a small sacrifice in targeting precision, they could have freed up billable hours worth hundreds of dollars.

Hire one extraordinary (wo)man

American philosopher Elbert Hubbard said that “one machine can do the work of fifty ordinary men. No machine can do the work of one extraordinary man.” And he was on to something. In engineering, a great engineer can do the work of 10 good engineers.

So, as more of an agency’s work gets done by machines and you need fewer humans to do repetitive work, having the smartest possible person to work on the tasks that remain will be more important than ever.

Conclusion

There’s never a boring day when working on PPC, mostly because Google pushes so many changes every year. But this year, AI is going to stir the pot and create some challenges unlike the ones we’ve been used to dealing with. Hopefully, some of the thoughts shared here will get you thinking about strategies for keeping your agency successful in a world of AI-first PPC.

Stay tuned for my next post in this series, where I’ll cover how the technology got us here and what we can automate today.


4 things SEO professionals should do consistently

As SEO professionals, we’re expected to have a solid understanding of our trade and to be able to communicate our knowledge clearly and professionally with our clients. But I think our expectations should be set a bit higher, similar to the fiduciary responsibility that certain financial professionals are held to. This would go a long way in further improving an already amazing industry, helping us to build greater trust while better serving our clients.

Never intentionally put clients at risk

Marketing requires us to constantly evaluate risk vs. reward, and that’s especially true when it comes to search engine optimization because algorithms are constantly changing. Some of the tactics that would have been acceptable just a few years ago could get a website penalized today.

But it goes beyond algorithms changing.

I’m a proponent of white-hat SEO because it creates a sustainable foundation for success, rather than the churn-and-burn approach that is required with black-hat SEO. But every now and then, clients will insist on tactics that will eventually hurt them. In some cases, this may be because they have little to lose and much to gain; in other cases, it may be because they are simply misinformed. Either way, it’s our job as professionals to never intentionally put our clients at risk through our actions, as well as help educate them so that they don’t do something stupid on their own.

Much as the medical profession has its Hippocratic Oath, our first obligation as SEO professionals is to do no harm to our clients’ websites.

Work with absolute transparency in all matters

I was recently speaking with a potential client who was unhappy with the results from the SEO company he was working with. It didn’t take long to figure out why. When I asked what they had done for his campaign, he couldn’t answer — because they told him their techniques were proprietary.

Every truly experienced, professional SEO practitioner knows that there is no such thing as “proprietary SEO techniques” because the days of tricking the search engines are dead and gone. Modern SEO consists mostly of three components:

Technical SEO (on-site SEO).
Original, high-quality content.
Editorial links from relevant websites.

There are no secrets, silver bullets or magic spells, and anyone who claims otherwise is simply a con artist.

We are performing work for clients that will have a long-lasting impact on their website, so it’s their right to know exactly what we’re doing on their behalf.

Now, some people will say, “But Jeremy, if I tell them exactly what I’m doing, they might try to do it themselves!” If you fear that, then you’re simply not providing enough value in the relationship.

Clients come to us for several reasons. One is that we can see and understand things that our clients can’t. Another reason is our ability to get certain things done.

Look, I want my clients to know exactly what goes into a proper SEO campaign because once they do, they realize that they don’t have the time to do it themselves — especially when you consider that it’s not enough to simply check a box. Tasks like content development and link building require a lot of work and have to be executed with a high level of quality. Most clients are already too busy running their own business to write content or send link outreach emails, and that’s exactly why they come to us.

Speaking of transparency…

Ensure that the client owns their properties, content and data

About a year ago, a small web design agency here in Tampa closed down with little notice, and because of a mutual contact, the former owner reached out to me to help migrate their clients to their own servers.

In doing so, I stumbled upon a huge problem that I often see in our industry, and that is digital marketing agencies and web designers setting up digital assets under their own accounts rather than their clients’. Such assets include, but are not limited to:

domain registrations
hosting accounts
Google Analytics
Google Search Console / Bing Webmaster Tools
social media profiles
PPC accounts

This poses a huge risk for our clients. Had this particular web designer gone out of business and simply disappeared, like many do, then his clients — dozens of small businesses — would have been forced to start their digital brands over from scratch. Some may have even been forced out of business as a result. This is a completely unacceptable practice.

Any accounts you set up for your client should be set up in their name, and they should always have full access. You can then add additional users for your team or simply log in with their credentials.

Work with specialists when necessary

One of the hallmarks of a true professional is knowing when something is outside of their expertise. When you encounter this scenario, it’s important to set ego aside and seek the assistance of a more qualified specialist.

No one is above this — in fact, I often see some of the brightest minds in our industry asking for advice from other experts who possess a different specialization.

The fact of the matter is that many of the most proficient SEO practitioners typically focus on a particular aspect of search, like Alan Bleiweiss does with forensic audits, or like Cindy Krum does with mobile SEO. By its nature, specialization in one area means weakness in other areas — and that’s OK because there are plenty of top-notch professionals in our industry we can lean on for their specific knowledge.

Obviously, that means added costs for our client in these cases, but it’s our job to convince them of the necessity in order to produce the best results possible with the least risk possible.


PPC agencies will play these 4 roles when automation takes over

Earlier this year, I wrote about how artificial intelligence (AI) and machine learning are driving automation in PPC and then again about how Google’s latest wave of AdWords innovations is driven largely by these same technologies.

As the move towards automation accelerates, how should agencies and PPC managers update their strategy? What processes will they need to remain competitive? And what can they really expect from automation tools in the market today? I’ll cover all these topics in a series of upcoming posts, so I’d love to hear your ideas. But today, let’s begin by looking at what roles humans and agencies will play in PPC.

1. Agencies will teach machines to learn

Now that machines can learn, they certainly will surpass humans, right? The reality is that machine learning is still very dependent on humans. We program the algorithms, we provide the training data, we even manipulate the training data to help the machine get it right.

Machine learning often requires structured data to learn from, and it needs a very well-defined problem to solve. We as humans will play a role for some time to define the problem and help shape the desired outcome by manipulating how the machine can “learn.”

For now, the machines need us to be their teachers. AdWords Quality Score only works because the wisdom of the crowds provides a massive set of data about queries and clicks that the machine can use to learn from.

Tesla’s autopilot works because thousands of drivers control their cars manually through tricky situations. Because they’re all networked, this helps the next Tesla better drive itself through that same spot.

In PPC, what we have learned from years of manually managing campaigns can be the basis for teaching computers how to respond in similar situations.

Teachers can’t teach everything, so a large part of what they do is help students ask better questions. As teachers to the computers, we should allow ourselves to ask more questions, because synthetic intellect doesn’t have the same human constraints for how quickly it can find answers.

Take Quality Score, for example — it is a machine learning system that can analyze hundreds of factors related to a search and find patterns of things that have a meaningful impact on CTR. Because it can analyze data so much faster, we can feed it seemingly random and unconnected data and let it tell us if this makes a difference.

Here’s a crazy question we once asked the Quality Score system: Does the lunar cycle impact CTR? While the answer isn’t what’s important (no, there was no correlation), what is important is that we were able to ask entirely new questions and quickly get an answer that helped make the system better.

But we should also prioritize the questions we ask based on human intuition. We don’t want to waste machine power by asking everything when we already know with a high probability that some answers won’t help us improve. Consider the following example: Ask Google Maps to calculate the best route from San Francisco to New York. Calculating every possible backroad will take a long time, and considering that we know highways tend to be faster than local roads, that calculation will almost certainly not yield a better result — so we can safely ignore that question.

2. Agencies will provide the creativity machines lack

The biggest value of an agency will be the ability of its employees to work collaboratively with automation.

Chess grandmaster Garry Kasparov notes that when it comes to chess, teams of humans assisted by machines dominate even the strongest computers. In a 2005 experiment, Playchess.com launched a chess tournament in which participants could play in teams with other players and/or computers. According to Kasparov:

The chess machine Hydra, which is a chess-specific supercomputer like Deep Blue, was no match for a strong human player using a relatively weak laptop. Human strategic guidance combined with the tactical acuity of a computer was overwhelming.

Humans are still good at creative strategy — putting old ideas together in new ways and testing the results. The reason we don’t have Google’s computers writing all the ads for AdWords is that they all would end up looking the same — and then they would stop evolving because the machine would no longer have any variations to test.

Evolutionary algorithms, a subset of AI, are based on biological evolution, and they need access to variations to work well. And while they can create their own mutations, humans often still know the right shortcuts to come up with better ideas.

An advertiser on Facebook once submitted an ad that was a static image that shook a bit. This had a far better CTR than the same ad when it was completely static. It’s kind of a silly way to produce better CTR, but it’s a great example of humans trying something new that the machine probably wouldn’t have thought of because nobody had done this before inside the realm of the data it had access to.

3. Agencies will be the pilot who averts disaster

Self-driving cars are not “driverless” cars because there’s still a human behind the wheel to monitor the machine. That makes sense because not killing its passengers or others on the road is valuable enough to deserve some human resources.

In PPC, we’re fortunately not dealing with life-or-death scenarios; but we can still put a pilot in place to monitor the most important areas of automation. The trick is figuring out the 80/20 rule and saving the human involvement for the automations with the biggest potential impact.

I once audited an account that had completely tanked because the bid automation had correctly reduced bids after the launch of a terribly performing landing page. But while the landing page was quickly fixed by humans, nobody remembered to reset the bids, and the account spent months with subpar performance because its best keywords were lingering on page two of the search results.

The problem with many systems built today is that they have narrow goals that can fail due to self-reinforcing feedback loops that can cause a downward spiral:

bad performance → bid down a bit → even worse performance → bid down some more → doom!
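To make that chain concrete, here is a toy sketch of it in code. The response curve, numbers and bid rule are all invented for illustration (this is not how any real bid management product works); the point is simply that a narrow rule with no floor and no human checkpoint keeps cutting bids even after the underlying problem, like the broken landing page above, has been fixed.

```python
# Toy illustration only: invented numbers and response curve, not any real
# bid management product. It shows the self-reinforcing downward spiral.

TARGET_CVR = 0.03

def observed_cvr(bid, page_broken):
    # Pretend conversion rate scales with bid (better position), unless the
    # landing page is broken, in which case almost nothing converts.
    return 0.002 if page_broken else min(0.06, 0.004 * bid)

def naive_bid_rule(bid, cvr):
    # "Bad performance -> bid down a bit", with no floor and no human review.
    return bid * 0.7 if cvr < TARGET_CVR else bid * 1.05

bid = 10.00
for day in range(1, 11):
    page_broken = day <= 3          # the landing page gets fixed on day 4
    cvr = observed_cvr(bid, page_broken)
    bid = naive_bid_rule(bid, cvr)
    print(f"day {day:2d}: page_broken={page_broken}, cvr={cvr:.3f}, next bid=${bid:.2f}")

# Even after the page is fixed, the bid is already too low for performance to
# climb back above target, so the rule keeps cutting: the downward spiral.
# A human "pilot" breaks the loop with a floor plus an alert, for example:
#     bid = max(bid, 4.00)                      # hypothetical floor
#     if bid == 4.00: print("flag for review")  # hypothetical alert hook
```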

We can also look beyond what our own automations are doing to find weaknesses to exploit in our competitors’ algorithms. Remember that many automations are doing tasks that are well-defined, and this makes them predictable. For example, I once had to cross four lanes of traffic on my bike and was going to wait to let a car pass me first. But when I noticed it was a Google self-driving car, I went for the turn anyway because I knew the car had perfect vision and was programmed not to hit bicyclists. And since I’m sharing this story, things went well for me in that scenario.

Sometimes, we can learn from what the machine does. Lee Sedol, the world-champion Go player who was beaten by DeepMind’s AlphaGo computer, became a better player from the experience of losing to a machine. He, as well as many others watching the game, was perplexed by the computer’s move 37. It was simply not a move any human would have played. But it was the move that set the computer up for the win, and now humans have added it to their own repertoire.

And sometimes your job as copilot is to see something that’s not there but that should have been. The book “How Not To Be Wrong” by Jordan Ellenberg tells the story of mathematician Abram Wald, who figured out what part of an airplane should be made stronger to resist being shot down by enemy fire during World War II. The data from planes that returned with bullet holes showed that there were more bullet holes in the fuel system than the engine. Scientists concluded that they should reinforce the fuel system. But Wald argued that planes that were hit in the engine probably crashed and never returned, and this skewed the data.

Let’s put that into a PPC example. When you look at what leads to a conversion because you want to do more of that, maybe you should also ask what doesn’t lead to a conversion and do less of that. For example, high shipping fees may tank your conversion rate, but you wouldn’t find this out if you asked the wrong question.

4. Agencies will have the empathy machines lack

Even when computers are doing every part of PPC management, they still won’t have the same human connection that you have with your clients. Understanding the nuances of your client’s business (which will help you come up with new ideas to test), their fears about PPC and their frustrations with the last account manager will all help you have a more productive relationship with them.

One surprising profession that is leveraging AI is medical doctors. They simply can’t read as much of the existing research as Watson, so IBM’s supercomputer can be a magnificent diagnostician. But Watson may not be able to explain conditions to a patient, and it certainly will not have the empathy of a human when sharing potentially devastating news. There is still a place for doctors even when they have a supercomputer to help them.

And as PPC experts, a large part of our role will be to know which expert automations to test in an account. For bid management alone, there is an overwhelming number of options, ranging from Google’s free Portfolio Bid Strategies to upstart bid management companies that charge thousands of dollars for the promise of a slightly better result. Knowing what is available, what is worth testing and how to calculate the trade-offs is certain to be a large part of the value agencies provide.

Conclusion

Automation is taking over a lot of the tasks humans have historically done in PPC; but as this shift continues, there will be plenty of new opportunities for PPC experts and agencies to provide value to their clients.

Next time, I’ll cover new strategies and processes that will help bridge the gap between humans and artificially intelligent PPC machines.


Looking to get out of the SEO business? Good!

It’s not your imagination — there’s no denying that SEO has become increasingly difficult over the last few years.

A recent thread on WebmasterWorld highlighted this fact when a poster lamented the challenges he has faced adapting to changes in the SEO industry, and he said that he was considering leaving the industry entirely. This prompted Barry Schwartz to share a poll collecting feedback on the topic from a wider pool, and I was initially quite surprised to see that a large percentage of respondents shared the same concern as the original poster.

In hindsight, I shouldn’t have been surprised at all. I can definitely understand the frustration. You build a business based on what you think the rules are, and almost immediately, they seem to change.

Isn’t that the nature of every industry, though? Amazon, Uber and Netflix all shook their respective industries in seismic ways. And as that happened, we heard the same cries that things are getting too difficult, that things just aren’t fair, and that people can’t make the same kind of money doing work the same way they used to.

But is that really such a bad thing?

How change can lead to improvements

Change is good for every industry, especially when it forces everyone to up their game. I’ve been in the digital marketing industry since long before Google dominated search, and I’ve seen a lot of changes to our industry during that time — some good and some bad.

I’ve watched search engines rank web pages based on nothing more than crude and easily gameable signals, like meta tags, keyword density and title tags. Later, I watched a clunky algorithm evolve from basic and easy to manipulate into something highly refined and effective. Along the way, I’ve watched many tactics come and go — and wave after wave of SEO practitioners enter, then retreat, from the industry, moving on to what they saw as greener pastures.

Maybe you’re brand-new to the industry, so you don’t have much to compare it to. Or maybe, like me and some of the other veterans of the industry, you’ve experienced every iteration of search engine optimization firsthand and have weathered the storms.

For example, I remember when link building was as simple as just getting more keyword-rich anchor text links than your competitors, which predictably resulted in the SEO equivalent of a nuclear arms race. Marketers took every opportunity to create links in any way they could.

I wasn’t immune to this thinking. In fact, at one point, I had built a massive network consisting of hundreds of websites that published a constant stream of user-generated, spun content, all with the sole purpose of linking to the other websites I owned. This network added no real value and existed solely for the purpose of manipulating organic ranking.

I had managed to fly under the radar with this tactic for many years, but eventually, Google’s Panda and Penguin algorithms destroyed it by penalizing websites that published low-quality content and/or participated in manipulative linking schemes. This destroyed both sides of the manipulative linking equation.

The fallout was brutal for many SEO practitioners and even worse for their clients, who often didn’t know what tactics were being used on their behalf. Websites that had previously benefited from these tactics now suffered aggressive penalties and were often removed from the search results entirely. Often, link-based penalties lasted for years, even after fixing the issues, which predictably destroyed many businesses.

Today, we’re living in an entirely different world. While they’re still being sold, the cheap and easy tactics of the past no longer work. This means that as SEO professionals, we have to work significantly harder than ever before. I know that might sound like a bad thing, but I believe it can help the industry.

Much like the housing collapse of 2008, which forced a lot of below-average realtors out of the real estate industry, this newest evolution of the SEO industry will hopefully purge many of the SEO practitioners who are still trying to peddle ineffective and/or dangerous tactics.

As the algorithms become more effective at differentiating artificial attempts to manipulate ranking from legitimate ranking signals, and they get better at understanding or even predicting what a searcher needs, we will see a lot of people — people who didn’t belong here in the first place — finally leave the industry. This means only the truly passionate will stick around.

SEO isn’t dead, but your tactics might be

There will be a lot more of “the sky is falling” and “SEO is dead” nonsense proclaimed by those who rely on tricks and are incapable of delivering any real value. But those of you who have been in the game for a while have heard it all before. In fact, we see this collective panic play out nearly every time there is a major update, going as far back as the infamous “Florida” update of 2003.

After Panda, we heard a lot of SEOs asking, “How can we possibly compete when we have to hire real writers to write well-written and longer articles?” While some could no longer compete when churning out a bunch of 350-word, poorly written articles each month no longer moved the needle, those who focused on producing lots of unique, thoroughly researched and well-written content reaped massive rewards.

Penguin created a similar situation when it destroyed manipulative link building. Once tactics like guest posting at scale, directory submissions and paid links were effectively eliminated, many stopped offering link-building services entirely because earning real links takes a lot of work.

Today, we’re approaching a similar situation — the next evolution, if you will — as artificial intelligence is playing an increasingly larger role in Google’s algorithm. SEO practitioners who have relied on clever tricks will be forced to either adapt or exit the industry, while those who focus on producing quality content and earning legitimate links will become the new standard.

Does that mean some SEO practitioners will be forced out of the industry? Absolutely. It also means that some SEO agencies will go out of business. And I’m completely fine with all of that because it means that the industry as a whole will be pushed to a higher standard, benefiting professionals in the industry, search engines, searchers, and most importantly, our clients.


Has AI changed the SEO industry for better or worse?

With Google turning to artificial intelligence to power its flagship search engine business, has the SEO industry been left in the dust? The old ways of testing and measuring are becoming antiquated, and industry insiders are scrambling to understand something new — something which is more advanced than their backgrounds typically permit.

The fact is, even Google engineers are having a hard time explaining how Google works anymore. With this in mind, is artificial intelligence changing the SEO industry for better or worse? And has Google’s once-understood algorithm become a “runaway algorithm?”

Who was in the driver’s seat?

The old days of Google were much simpler times. Artificial intelligence may have existed back then, but it was used for very narrow issues, like spam filters on Gmail. Google engineers spent most of their time writing preprogrammed “rules” that worked to continuously close the loopholes in their search engine — loopholes that let brands, with the help of SEO professionals, take advantage of a static set of rules that could be identified and then exploited.

However, this struck at the heart of Google’s primary business model: the pay-per-click (PPC) ad business. The easier it was to rank “organically” (in Google’s natural, unpaid rankings), the fewer paid ads were sold. These two distinctly different parts of their search engine have been, and will always be, at odds with one another.

If you doubt that Google sees its primary business as selling ads on its search engine, you haven’t been watching Google over the past few decades. In fact, almost 20 years after it started, Google’s primary business was still PPC. In 2016, PPC revenues still represented 89 percent of its total revenues.

At first glance, it would stand to reason that Google should do everything it can to make its search results both user-friendly and maintainable. I want to focus on this last part — having a code base that is well documented enough (at least, internally within Google) so that it can be explained to the public, as a textbook of how websites should be structured and how professionals should interact with its search engine.

Going up the hill

Throughout the better part of Google’s history, the company has made efforts to ensure that brands and webmasters understood what was expected of them. In fact, they even had a liaison to the search engine optimization (SEO) world, and his name was Matt Cutts, the head of Google’s Webspam Team.

Cutts would go around the SEO conference circuit and often be the keynote or featured session speaker. Any time Google was changing its algorithms or pushing a new update to its search engine, Cutts would be there to explain what that meant for webmasters.

It was quite the spectacle. In one room, you typically had hundreds of SEOs who were attacking every loophole they could find, every slim advantage they could get their hands on. In the very same room, you had Cutts explaining why those techniques were not going to work in the future and what Google actually recommended.

As time went on and loopholes were closed, Cutts became one of the only sources of hope for SEOs. Google was becoming more sophisticated than ever, and with very few loopholes left to exploit, Cutts’s speaking engagements became crucial for SEOs to review and dissect.

The ‘uh-oh’ moment

And then, the faucet of information slowed to a trickle. Cutts’ speaking engagements became rarer, and his guidelines became more generic. Finally, in 2014, Cutts took a leave from Google. This was a shock to insiders who had built an entire revenue model off of selling access to this information.

Then, the worst news for SEOs: He was being replaced by an unnamed Googler. Why unnamed? Because the role of spokesperson was being phased out. No longer would Google be explaining what brands should be doing with each new update of its search engine.

The more convoluted its search engine algorithms were, the more PPC ads Google sold. As a result of this shift, Google capitalized immensely on PPC ad revenue. It even created “Learn with Google,” a gleaming classroom where SEO conference attendees could learn how to maximize PPC spend.

An article by Search Engine Land columnist Kristine Schachinger about the lack of information on a major algorithmic update, and Google’s flippant response from interim spokesman Gary Illyes, summed up the SEO industry’s frustration in a nutshell. What was going on?

Removing the brakes — the switch to an AI-powered search engine

At the same time, Google was experimenting with new machine learning techniques to automate much of the updating process to its search engine. Google’s methodology has always been to automate as much of its technology as it could, and its core search engine was no different.

The pace of Google’s search engine switch to artificial intelligence caught many off-guard. This wasn’t like the 15 years of manual algorithm updates to its index. This felt like a tornado had swept in — and within a few years, it changed the landscape of SEO forever.

The rules were no longer in some blog or speech by Matt Cutts. Here stood a breathtaking question: Were the rules even written down at Google anymore?

Many of the search engine’s algorithms and their weightings were now controlled by a continuously updating machine-learning system that changed its weightings from one keyword to the next. Marcus Tober, CTO of SearchMetrics, said that “it’s very likely that even Google Engineers don’t know the exact composition of their highly complex algorithm.”

The runaway algorithm

Remember Google’s primary revenue stream? PPC represents almost 90 percent of its business. Once you know that, the rest of the story makes sense.

Did Google know beforehand that the switch to an AI-powered search engine would lead to a system that couldn’t be directly explained? Was it a coincidence that Cutts left the spotlight in 2014, and that the position never really came back? Was it that Google didn’t want to explain things to brands anymore, or that they couldn’t?

By 2017, Google CEO Sundar Pichai began to comment publicly on Google’s foray into artificial intelligence. Bob Griffin, CEO of Ayasdi, wrote recently that Pichai made it clear that there should be no abdication of responsibility associated with intelligent technologies. In other words, there should be no excuse like “The machine did x.”

Griffin put it clearly:

Understanding what the machine is doing is paramount. Transparency is knowing what algorithm was used, which parameters were used in the algorithm and, even, why. Justification is an understanding of what it did, and why in a way that you can explain to a reporter, shareholder, congressional committee or regulator. The difference is material and goes beyond some vague promise of explainable AI.

But Google’s own search engineers were seemingly unable to explain how their own search engine worked anymore. This discrepancy had gotten so bad that in late 2017, Google hired longtime SEO journalist Danny Sullivan in an attempt to reestablish its image of transparency.

But why such a move away from transparency in the first place? Could it be that the move to artificial intelligence — something that went way over the heads of even the most experienced digital marketing executives — was the perfect cover? Was Google simply throwing its proverbial hands up in the air and saying, “It’s just too hard to explain?” Or was Google just caught up in the transition to AI, trying to find a way to explain things like Matt Cutts used to do?

Regardless of Sullivan’s hire, the true revenue drivers meant that this wasn’t a top priority. Google had solved some of the most challenging technical problems in history, and they could easily have attempted to define these new technical challenges for brands, but it simply wasn’t their focus.

And, not surprisingly, after a few years of silence, most of the old guard of SEO had accepted that the faucet of truly transparent communication from Google had been shut off, never to be turned on again.

Everyone is an artificial intelligence expert

Most SEO experts’ backgrounds do not lend themselves very well to understanding this new type of Google search. Why? Most SEO professionals and digital marketing consultants have a marketing background, not a technical background.

When asked “How is AI changing Google?,” most answers from industry thought leaders have been generic: AI really hasn’t changed much; effective SEO still requires the same strategies you’ve pursued in the past. In some cases, the responses simply had nothing to do with AI in the first place.

Many SEO professionals, who know absolutely nothing about how AI works, have been quick to deflect any questions about it. And since very few in the industry had an AI background, the term “artificial intelligence” became something else entirely — just another marketing slogan, rather than an actual technology. And so some SEO and digital marketing companies even began positioning themselves as the new “Artificial Intelligence” solution.

The runaway truck ramp?

As with all industries, whenever there’s a huge shift in technology, there tends to be a changing of the guard. A number of highly trained engineers are beginning to make the SEO industry their home, and these more technologically savvy folks are starting to speak out.

And, for every false claim of AI, there are new AI technologies that are starting to become mainstream. And these are not your typical SEO tools and rank trackers.

Competitive industries are now investing heavily in things like genetic algorithms, particle swarm optimization and new approaches that enable advanced SEO teams to model exactly what Google’s RankBrain is attempting to do in each search engine environment.

At the forefront of these technologies is industry veteran and Carnegie Mellon alumnus Scott Stouffer, founder and CTO of MarketBrew.com, who chose to create and patent a statistical search engine modeling tool based on AI technologies rather than pursue a position at Google.

Now, 11 years into building his company, Stouffer has said:

There are a number of reasons why search engine modeling technology, after all these years, is just now becoming so sought-after. For one, Google is now constantly changing its algorithms, from one search query to the next. It doesn’t take a rocket scientist to know that this doesn’t bode well for SEO tools that run off of a static set of pre-programmed rules.

On the flipside, these new search engine models can actually be used to identify what the changes are statistically, to learn the behavior and characteristics of each search engine environment. The models can then be used to review why your rankings shifted: was it on-page, off-page, or a mixture of both? Make an optimization on your site, and rerun the model. You can instantly see if that change will statistically be a positive or negative move.

I asked Stouffer to give me a concrete example. Let’s say you see a major shift in rankings for a particular search result. These search engine modeling tools start with what Stouffer calls a “standard model.” (Think of this as a generic search engine that has been regression-tested to be a “best fit,” with adjustable weightings for each algorithmic family.) This standard model is then run through a process called particle swarm optimization, which locates a stable mixture of algorithmic weightings that will produce search results similar to the real thing.

Here’s the catch: If you do this before and after each algorithmic shift, you can measure the settings on the models between the two. Stouffer says the SEO teams that invest in Market Brew technology do this to determine what Google has done with its algorithm: For instance, did it put more emphasis on the title tags, backlinks, structured data and so on?
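Stouffer doesn’t publish the internals of his system, so to make the idea more concrete, here is a generic toy sketch (not Market Brew’s product or its patented method): use particle swarm optimization to find weightings under which a simple linear scoring model reproduces an observed ranking order. Every page, feature and number below is invented for illustration; a real model would involve far more algorithmic families and extensive regression testing.

```python
# Generic toy sketch: fit algorithm "weightings" so a linear scoring model
# reproduces an observed ranking. All pages, features and numbers are invented.
import random

# Invented feature values for five pages: [title relevance, backlinks, structured data]
PAGES = {
    "page_a": [0.9, 0.2, 0.1],
    "page_b": [0.4, 0.9, 0.3],
    "page_c": [0.6, 0.5, 0.9],
    "page_d": [0.2, 0.3, 0.2],
    "page_e": [0.7, 0.7, 0.6],
}
OBSERVED_ORDER = ["page_e", "page_b", "page_c", "page_a", "page_d"]  # pretend SERP

def model_order(weights):
    score = lambda feats: sum(w * f for w, f in zip(weights, feats))
    return sorted(PAGES, key=lambda p: score(PAGES[p]), reverse=True)

def disagreement(weights):
    # Count pairwise ordering disagreements between the model and the observed SERP.
    pos = {p: i for i, p in enumerate(model_order(weights))}
    obs = {p: i for i, p in enumerate(OBSERVED_ORDER)}
    pages = list(PAGES)
    return sum(
        1
        for i in range(len(pages))
        for j in range(i + 1, len(pages))
        if (pos[pages[i]] - pos[pages[j]]) * (obs[pages[i]] - obs[pages[j]]) < 0
    )

def particle_swarm(dims=3, n_particles=20, iters=100):
    # Minimal particle swarm optimization over the weight vector.
    parts = [[random.random() for _ in range(dims)] for _ in range(n_particles)]
    vels = [[0.0] * dims for _ in range(n_particles)]
    pbest = [p[:] for p in parts]
    gbest = min(pbest, key=disagreement)
    for _ in range(iters):
        for i, p in enumerate(parts):
            for d in range(dims):
                r1, r2 = random.random(), random.random()
                vels[i][d] = (0.7 * vels[i][d]
                              + 1.5 * r1 * (pbest[i][d] - p[d])
                              + 1.5 * r2 * (gbest[d] - p[d]))
                p[d] = min(1.0, max(0.0, p[d] + vels[i][d]))
            if disagreement(p) < disagreement(pbest[i]):
                pbest[i] = p[:]
        gbest = min(pbest, key=disagreement)
    return gbest

if __name__ == "__main__":
    best = particle_swarm()
    print("fitted weightings:", [round(w, 2) for w in best])
    print("model order:", model_order(best), "disagreements:", disagreement(best))
```

Run a fit like this before and after a ranking shift, and the change in the fitted weightings hints at which algorithmic family moved, which is the before-and-after comparison Stouffer describes above.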

Suffice it to say, there are some really smart people in this industry who are quickly returning the runaway algorithm back to the road.

Chris Dreyer of Rankings.io put it best:

I envision SEO becoming far more technical than it is today. If you think about it, in the beginning, it was super easy to rank well in search. The tactics were extremely straightforward (i.e., keywords in a meta tag, any link placed anywhere from any other website helped, etc.). Fast forward just a decade and SEO has already become much more advanced because search algorithms have become more advanced. As search engines move closer to the realistic human analysis of websites (and beyond), SEOs will have to adapt. We will have to understand how AI works in order to optimize sites to rank well.

As far as Google goes, the hiring of Sullivan should be a very interesting twist to follow. Will Google try to reconcile the highly technical nature of its new AI-based search engine, or will it be more of the same: generic information intended to keep these new technologists at bay, and to keep Google’s top revenue source safe?

Can these new search engine modeling technologies usher in a new understanding of Google? Will the old guard of SEO embrace these new technologies, or is there a seismic shift underway, led by engineers and data scientists, not marketers?

The next decade will certainly be an interesting one for SEO.


Web.com buys Acquisio for machine learning, PPC management capabilities

Acquisio, which provides SEM campaign management tools for enterprises and small businesses, was acquired by Web.com. Web.com has many thousands of small business customers, who buy web hosting and a range of digital marketing services, including search, social and e-commerce.

The terms of the transaction, announced Wednesday, were not disclosed.

Montreal-based Acquisio has invested heavily in developing machine learning and automation capabilities over the past several years. Web.com, based in Jacksonville, Florida, said in an email that it will be broadly applying those tools to its PPC, e-commerce and social media marketing over time.

Acquisio CEO Marc Poirier, who co-founded the company 10 years ago with CTO Richard Couture, said in a blog post that nothing would change for current customers of the platform.

The company has detailed how machine learning can greatly simplify campaign management, improve performance and enable greater scale with fewer people.

In a study (PDF) involving 30,000 of its customers, Acquisio said that machine-learning-managed PPC accounts saw improved metrics across the board, including lower CPCs and higher conversions:

Conversions increased by 71 percent on average.
CPCs decreased by 7 percent on average.
CTR increased by 15 percent on average.
64 percent of accounts saw a lower CPA.

Acquisio added that machine learning-supported accounts also experienced lower churn on the platform than those not using machine learning. This was especially true for lower-budget accounts (sub $500 per month), where machine learning was able to increase lifetime value.

These outcomes are what likely got Web.com’s attention.

Mobile-first updates from SMX East

As every SEO knows, the rise of mobile searches has prompted Google to prioritize mobile signals in determining search results. To that end, the search giant is in the slow-going process of rolling out its mobile-first index, which is expected to be fully implemented sometime next year.

In the meantime, getting sites ready is a high-priority item on SEOs’ to-do lists, which is why the topic was addressed at this week’s SMX East conference in a panel discussion titled, “SEO For Google’s Mobile-First Index & Mobile-Friendly World.” The speakers included Leslie To, director of SEO for 3Q Digital; Ashley Berman Hale, director of SEO at Local SEO Guide; and Gary Illyes, webmaster trends analyst for Google.

Today’s post will cover the key points presented in this panel.

Leslie To: Is it the year of mobile yet?

Leslie To breaks down the process of preparing for the mobile-first index into two major categories:

Configuration-agnostic auditing
Configuration-based auditing

Configuration-based auditing would involve those things you need to do that are specific to your mobile configuration (whether that’s a mobile subdomain, dynamic serving or responsive web design).

Configuration-agnostic auditing, on the other hand, involves items you need to address regardless of your mobile configuration, and this is what To covered first.

Configuration-agnostic auditing

Let’s start with a summary look at what matters regardless of mobile configuration:

Tips:

Use HTML for rich media and video content, and use the video element to download and decode the content.
Avoid interstitials. If you want to promote your app or email list, use banners rather than full-screen overlays or interstitials. Users don’t like them, and neither does Google.
Consistently test your global navigation and mine internal search data to refine that navigation (based on what you see users aren’t finding). Further, remember that mega menus don’t always work well on mobile. Simply put, don’t overwhelm users with menu options when you have limited screen real estate.
Do allow content and media to scale to fill device screen size, as that provides a good user experience. To help with this, stay away from absolute declarations in your CSS.
Do allow all font sizes to scale, and use 16px as your base font size. Don’t require users to zoom to read, interact with or consume content. No one likes to do that.
Make your tap targets at least 48 pixels wide to make them easy to hit. In addition, space your tap targets 32 pixels (or more) apart. Don’t require users to zoom to tap buttons, links or form fields.
Allow common gesture features on your e-commerce site, especially pinch/double-tap to zoom. Don’t use low-resolution images that become pixelated when you zoom.
Configure internal site search to make content easier to find, and actively harvest site search queries to learn more about what users are looking for on your site so that you can make navigation, layout and content improvements over time.
Enable contextual keyboards that change based on required input types. Using one standard keyboard layout for all input can be difficult for users to deal with. Don’t assume the limitations of physical keyboards. For example, if you’re looking for someone to enter a domain name or email address, have a “key” that they can tap to enter “.com” — these types of contextual features will save them time.
Make it easy for users to convert, whether it’s via a form fill, a phone call or your shopping cart. Enable click-to-call by wrapping phone numbers with telephone schema. Don’t require more than three clicks to complete a conversion.
Implement all the basics of page speed. This means things like enabling gzip compression, leveraging browser caching and getting server response time under 200 milliseconds.
Don’t use render-blocking JavaScript, especially for external scripts. Don’t use inline CSS attributes and/or a large CSS file.
Don’t make the language on your site too complex. Readability is a big concern (and not just for mobile sites).

Leverage readability indexes, such as:

Flesch reading ease
Flesch-Kincaid grade level
Gunning Fog index
SMOG readability formula

You can also get readability measurements for your content within Microsoft Word: enable “Show readability statistics” in Word’s proofing options, and the scores will appear after you run a spelling and grammar check.
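
To make two of the configuration-agnostic tips above concrete (contextual keyboards and click-to-call), here is a minimal HTML sketch. It is illustrative only; the field names, form URL and phone number are invented placeholders, not anything from To’s presentation:

    <!-- Hypothetical contact block: contextual keyboards and click-to-call -->
    <form action="/contact" method="post">
      <label for="email">Email</label>
      <!-- type="email" brings up an email-optimized keyboard on most phones -->
      <input type="email" id="email" name="email" autocomplete="email">
      <label for="phone">Phone</label>
      <!-- type="tel" brings up the numeric dial-pad keyboard -->
      <input type="tel" id="phone" name="phone" autocomplete="tel">
      <button type="submit">Send</button>
    </form>
    <!-- Click-to-call: tapping the number starts a phone call on mobile devices -->
    <a href="tel:+15551234567">Call us: (555) 123-4567</a>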

Configuration-dependent auditing: Mobile subdomains

If you’re using a mobile subdomain, you will need to implement bidirectional linking, with a rel=alternate tag on your desktop pointing to your mobile site, and a rel=canonical tag pointing from your mobile site back to your desktop site. These are sometimes called switchboard tags.

A common question is whether Google will want publishers to reverse the direction of those tags once the mobile-first index arrives. To date, Google’s answer has been no, this is not necessary; they will simply assume the reverse. From Google’s perspective, asking everyone to switch the tags would likely create a fair amount of chaos.
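
For reference, the switchboard tags themselves are small. A hedged sketch, with example.com standing in for your own domain and /page for a real URL (the media value shown is the commonly used pattern; adjust it to your own breakpoint):

    <!-- On the desktop page (https://www.example.com/page) -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">
    <!-- On the mobile page (https://m.example.com/page) -->
    <link rel="canonical" href="https://www.example.com/page">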

Do minimize cross-linking: default links within the mobile experience should point to other mobile pages. That said, you should still provide a link to the desktop experience for users who want it. One benefit is that you can monitor clicks on that link; if there are lots of them, it may indicate problems in your mobile experience that you need to debug.

Do say no to blanket redirects, and try to make them all one-to-one. If you have no corresponding mobile content, leave users on the desktop page.

Configuration-dependent auditing: Dynamic serving

If you’re using dynamic serving, you will need to implement the Vary HTTP header (Vary: User-Agent). This helps prevent users from being served the wrong version of your pages because of ISP caching: without the header, cached responses may deliver your desktop page to mobile users, and vice versa.

Also watch out for, and avoid, unintended content differences between the desktop and mobile versions that creep in because the two are maintained separately.

Configuration-dependent auditing: Responsive

With responsive sites, make sure you’re not blocking CSS or JavaScript files from being crawled. Also check for the meta viewport tag, which tells the browser how to handle dimensions and scaling:

width=device-width: matches content to the physical width of the device.
initial-scale: the initial zoom level when a user visits the page.
user-scalable: allows or disallows zooming (values are “yes” and “no”).

Use commas to separate the attributes so that older browsers can still parse them.
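
Putting that together, a typical viewport declaration looks like the line below. This is the common pattern rather than anything specific from To’s slides:

    <!-- Match the layout to the device width and start at 100 percent zoom -->
    <meta name="viewport" content="width=device-width, initial-scale=1">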

Do make sure that images and videos are also responsive, but don’t allow video to scale beyond the viewport size.

Last but not least, don’t base breakpoints on specific devices. Leverage Google Analytics’ Device Report to determine whether your breakpoints are properly serving your customers most of the time.

View Leslie To’s full presentation here:

Is It the Year of Mobile Yet? By Leslie To from Search Marketing Expo – SMX

Ashley Berman Hale: Mobile Friendly IRL, Beyond Best Practices

Berman Hale’s focus is on how to deal with the problem when you can’t get the budget or approval to make a site mobile-friendly.

When trying to get buy-in from stakeholders, Berman Hale suggests leaning on Google documentation and sharing relevant case studies. She also suggests showing desktop vs. mobile traffic over time — even in industries that are slow in moving to mobile, your analytics data is highly likely to show a strong trend in favor of mobile over time. Related to this is the idea of looking at competitor sites in SEMrush and showing their mobile traffic over time.

For some businesses, the issue may be that they only have a small budget. If that’s your situation, consider starting small. For example, you can break down your mobile friendliness action items into more manageable parts, including:

by site section.
by product.
by customer.
by element.

Another practical tip is to focus on getting people on board one at a time. These kinds of approaches can help you build momentum in a positive way.

In other cases, the challenge might be that the code is a hot mess, and everyone is afraid to touch anything. The incremental approach can work well here, too. For example, you can:

compress your images.
figure out how to strip some CSS.
implement AMP on just a few elements.

Or perhaps your role is such that you only have control over the content on the site, and not the coding side of things. You can still make a difference. You can accomplish this by thoroughly understanding the intent of people who are reading your content on mobile and making it easy for them to find what they want.

This starts with upfront research, including your keyword research. Use this to help you understand the likely user intents, and then form your content around those concepts. Structure your content to make it easy to find, and create snackable, modular elements. In addition, modify your metadata and markup to communicate what users will get by engaging with your content.

You may have people in your business who care only about brick-and-mortar sales. But local search is typically a huge driver for that, and local search often is mobile search.

The key to unraveling this is learning how to track the progression from local searches to your site and business. Setting this up can help you get what you need to show people that local (and mobile) is critical to your business.

Or, if you’re in the right business, you may be able to call in legal. Your industry may have accessibility requirements, and a solid mobile experience may simply be something that you’re required to do.

Lastly, you should always pick your battles and “choose what hill to die on.” Make sure you are making steady progress over time; the path to maximum mobile-friendliness is definitely a marathon, and not a sprint.

View Ashley Berman Hale’s full presentation here:

Mobile Friendly IRL: Beyond Best Practices By Ashley Berman Hale from Search Marketing Expo – SMX

Gary Illyes: Google’s perspective

Illyes explains that, traditionally, the Google index is based on crawls of desktop content. However, the problem Google has had is that on many sites, the desktop site would have more content on its pages than the corresponding mobile pages. This was leading to problems in search because Google would return pages to mobile users based on the content they found on the desktop pages, but the users would then get served the mobile page and the content wasn’t there.

This created frustration with the quality of Google’s results, and this ended up driving the idea of switching to a mobile-first index. What this means is that Google will crawl mobile sites and base their search index off of the content they find from that crawl.

Illyes’ message on this is: “Don’t freak out.” Google is approaching this very carefully, and they don’t yet know when a full mobile-first index will go into effect. They started experimenting with it two years ago, and it did not go well at all.

Currently, they have moved a small number of sites into a mobile-first index, and they have been monitoring those to make sure they’re not being hurt in terms of traffic and ranking as a result.

Eric’s note: Google has to be very careful about these types of changes. While they may be desirable at some level, searchers often have pretty specific things they want and need, including specific brands, and if they’ve been artificially demoted, this will also result in user frustration. This is the same reason that things like HTTPS and page speed are such weak ranking factors.

Illyes next notes that if your site is responsive, you’re good to go! But many sites with other mobile configurations are not.

Common issues with mobile sites are:

Some of the content and links from the desktop site may not be present.
Rel= annotations may not be there (e.g., hreflang).
Structured data may be missing.
Some of the media and images may be missing.

Illyes then shared the example of one site that did not move over their hreflang tags, and they lost 50 percent of their traffic. This is exactly the type of thing that Google wants to avoid.

Here are the things you should do to prepare for the mobile-first index:

    If your site is responsive, you’re already ready to go.
    Make sure your mobile pages have all the same videos and images as your matching desktop pages.
    Make sure your mobile site has all the content and all the links that show up on the matching desktop pages.
    Make sure to implement hreflang tags on the mobile pages (a brief sketch follows this list).
    Make sure to carry over the structured data from your desktop pages.
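
As a rough illustration of the hreflang and structured data items, here is what the head of a mobile page on an m-dot subdomain might carry over from its desktop counterpart. The domain, URLs and product name are placeholders, and the exact hreflang targets depend on how your international and mobile versions are set up:

    <head>
      <!-- Switchboard tag: canonical pointing back to the desktop page -->
      <link rel="canonical" href="https://www.example.com/widgets/">
      <!-- hreflang annotations carried over so language/country targeting isn't lost -->
      <link rel="alternate" hreflang="en-us" href="https://m.example.com/widgets/">
      <link rel="alternate" hreflang="de-de" href="https://m.example.com/de/widgets/">
      <!-- Structured data carried over from the desktop page -->
      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Widget"
      }
      </script>
    </head>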

Last, but not least, don’t panic!


Search Engine Land Awards Gala spotlights top performers in the SEO & SEM industry

The SEO and SEM industry came out in full force at last night’s third annual Search Engine Land Awards Gala in New York City.

Created by Search Engine Land and held in conjunction with this week’s SMX East Conference, the event recognized top performers across 24 different award categories, including Best B2B SEM Initiative, Best SEO Initiative, Best Search Marketer of the Year and Search Agency of the Year.

“It’s my third year as a judge, and the industry continues to blow us away with the incredible work happening in search marketing,” said Search Engine Land Associate Editor Ginny Marvin, who emceed the event along with Search Engine Land Editor in Chief Michelle Robbins.

“What struck me about all the entries was not just the comprehensiveness of the campaigns,” said Robbins, “but the creativity and unique approaches to problem-solving demonstrated in the work.”

In addition to recognizing the best and brightest in the industry, the event also benefited the non-profit organization Women Who Code, which received a portion of proceeds as a donation.

“We selected this charity because their work is centered around providing resources and opportunities for women to excel in technology careers,” said Robbins. “As a woman who’s been coding for 20 years now, at a company 71 percent powered by women, we are thrilled to be able to provide this assistance to bring more peers into the fold.”

Best dress man at the #LandyAwards with some really talented women from #smx @sengineland @MichelleRobbins @GinnyMarvin @elisabethos pic.twitter.com/uZv5srKNWB

— Barry Schwartz (@rustybrick) October 25, 2017

Event sponsors included Google, Acronym Media and Stone Temple. Derrick Djang, head of event marketing for Google, also took the stage to say a few words about last night’s winners.

“In my role at Google, I see a lot of events. It’s literally my job,” said Djang. “But the ones I’ve come to love the most are the ones like the Search Engine Land Awards, where our industry gathers to promote and celebrate innovation.”

Google’s Djang said he’s more and more impressed each year by the entries submitted, and that this year’s nominees completely blew him away.

“All of you in this room have raised expectations for what’s possible in advertising,” said Djang.

For anyone unable to attend last night’s event, here is the full list of Search Engine Land Award winners.

The 2017 Search Engine Land Award Winners

Best Retail SEM Initiative: Brainlabs
Best Retail SEO Initiative: iCrossing
Best Local SEM Initiative: DAC Group
Best Local SEO Initiative: Odd Dog Media
Best Mobile SEM Initiative: Precis Digital
Best Mobile SEO Initiative: Wolfgang Digital
Best B2B SEM Initiative: Reprise Media
Best B2B SEO Initiative: iCrossing
Best SEO Initiative — Small Business: Trimark Digital
Best SEM Initiative — Small Business: Todd Silverstein
Best Overall SEO Initiative — Enterprise: SapientRazorfish
Best Overall SEM Initiative — Enterprise: Nebo
Best Enterprise SEM Initiative — Travel & Lifestyle: Mediahub
Best Enterprise SEM Initiative — Financial Services: Merkle (US)
Best Enterprise SEM Initiative — B2B: Metric Theory
Best Enterprise SEM Initiative — B2C: Brainlabs
Best Cross-Channel Integration of Search: Precis Digital
Best Overall Search Marketing Initiative: Wolfgang Digital
Search Marketer of the Year: Genevieve Head-Gordon
Search Marketer of the Year: John Lincoln
In-House Team of the Year — SEM: UPMC Health Plan
In-House Team of the Year — SEO: Comcast Xfinity
Agency of the Year — SEM: Wpromote
Agency of the Year — SEO: HigherVisibility

Search Engine Land applauds this year’s award winners and sends thanks to all the participants who submitted entries. We can’t wait to see what you achieve next year!

SEO Ranking Factors in 2017: What’s Important and What’s Not

As technology advances, search engines can refine their ranking algorithms to better determine relevance and return results that more closely align with searcher intent.

Because these ranking algorithms are constantly being improved and refined, search engine ranking factors are always evolving. Factors that might once have had a huge impact on search rankings may no longer matter all that much, and new ranking factors (such as mobile-friendliness or HTTPS) can emerge to reflect changing technologies and user behaviors.

So, what are the most important ranking factors today, in 2017? A panel at SMX East, “SEO Ranking Factors in 2017: What’s Important and What’s Not,” sought to answer that question. This panel featured data from large-scale studies performed by SEMrush and Searchmetrics, as well as case studies and practical advice for adapting your SEO strategies to current realities.

SEMrush Ranking Factors 2.0

The first panelist was Olga Andrienko from SEMrush, who shared the results of a large-scale study on ranking factors that examined the top 100 positions for 600,000 keywords. Keywords were grouped by search volume into the following categories:

Very High: 10,001 monthly searches and up
High: 1,001 to 10,000 monthly searches
Medium: 101 to 1,000 monthly searches
Low: 1 to 100 monthly searches

SEMrush looked at on-page factors, referring domains and traffic data, then compiled their findings to see which ranking factors appeared to be the most important. Here were some of their findings:

Website security (HTTPS)

SEMrush found that 65 percent of domains in the top three positions for Very High volume keywords are already secure. Although it’s not a huge ranking factor, Andrienko recommended switching to HTTPS to help with conversions and building trust.

Content length

SEMrush found that content length generally had a positive correlation with search rankings; content for pages in the top three positions is 45 percent longer, on average, than content in the 20th position.

Even so, Andrienko did not recommend simply writing a ton of content in order to rank better — the key is to write sufficiently long content that is relevant and matches user intent. Look at what your competitors are doing, and figure out how you can create content that provides more value to users.

Keywords

SEMrush had some interesting findings with regard to keywords. They found that:

35 percent of domains ranking for high-volume keywords don’t have the keyword in the title. This suggests that Google’s algorithms are getting better at understanding context/synonyms, and/or that keywords in the page title are becoming a less important ranking factor.
Very few links contain a keyword in the anchor text — in fact, even among Very High volume keywords, only 8 percent of link anchors included a keyword. This may suggest that keywords in anchor text are not a major ranking factor, but it also might be a reflection of SEOs adhering more strictly to link-building best practices that see keyword anchor text as spammy.

Website traffic

SEMrush also studied the impact of website traffic on rankings. They found that the number of visits matters for high-volume keywords.

Interestingly, search traffic specifically did not appear to have any impact on rankings; direct traffic, however, does.

User signals

The SEMrush study also looked at various user signals, including:

Bounce rate. Overall, bounce rate is low for the top three positions but gets higher as you go down — this could suggest that top-ranking sites have more relevant content, better site speed, higher user trust and so forth.
Pages per session. Higher pages per session correlates with rankings, too.

Andrienko suggested that Google does not take user signals into account directly, but if yours are low, it means users aren’t engaging with your site the way they should be.

Links

High-quality link building is still super-important, both in terms of referring domains and “followed” backlinks. Andrienko noted that backlinks matter, especially for sites targeting keywords with fewer than 10,000 monthly searches.

What factor is most important?

Interestingly, SEMrush found that user signals and (direct) website traffic were actually the highest predictors of top rankings. Andrienko theorized that this was because top-ranking sites (i.e., those on page 1) are all doing on-page optimization well, meaning that Google needs new criteria to differentiate among these sites.

See Olga Andrienko’s full presentation here:

SEMrush Ranking Factors 2.0: SEMrush 2017 Study With Unreleased Updates By Olga Andrienko from Search Marketing Expo – SMX

Why General Ranking Factors Are Dead!

Next up was Marcus Tober from Searchmetrics. His company also analyzed ranking factors, but rather than look at factors by keyword search volume, he looked at factors by general trends versus individual industry/niche trends.

Tober noted that, while there are broad, general trends in terms of overall ranking factors, specific industries and niches seem to weight certain ranking factors more heavily. Here are some of Searchmetrics’ findings:

General trends

Everyone is improving their page load time across the board. While this isn’t a massive ranking factor, it’s important to see how you compare to your competitors so you don’t get left behind.

Like Andrienko, Tober found that keywords in titles are not that important. Indeed, only 48 percent of top-ranking (position #1) websites have their keywords in the title tag, suggesting that Google is getting better at judging relevance without this factor.

Searchmetrics also found that word count for top-ranking pages is increasing. Both Tober and Andrienko note that word count correlates with rankings, but both also advise against simply “going big” on content and hoping for an increase in rankings.

Industry-specific trends

Tober found that different ranking factors seemed to be weighted differently depending on the query itself, so Searchmetrics broke out ranking factors by industry in their study (specifically looking at e-commerce, finance, health, media and travel).

The study looked at how ranking factors within each of these industries were weighted against the average — this provided some insight into which ranking factors are most relevant for each of these industries.

For example, HTTPS is a bigger deal for finance sites, as those require more user trust; however, it does not seem to be as heavily weighted for travel sites. Usage of images, on the other hand, was not so important for finance websites but had a larger impact for travel sites.

The point here is user intent: What does the user want? That is naturally going to be different for different industries.

Niche-specific trends

It isn’t just industries that differ: Searchmetrics also looked at more niche types of websites to see what trends they could find. This included dating sites, SEO services sites and recipe sites.

Again, Tober found that certain ranking factors were weighted differently based on niche. For example, HTTPS usage is high among SEO sites but not among dating and recipe sites. On the other hand, use of structured data and Schema.org markup was highest among recipe sites — likely because recipes have valuable rich snippets associated with them in SERPs.

Overall, Tober’s message was that ranking signals are relative to your industry and niche, so consider what your users need when considering how you structure your site and create content for your pages. He echoed Andrienko’s call to look at your competition and see what they’re doing.

See Marcus Tober’s full presentation here:

Why General Ranking Factors Are Dead! By Marcus Tober from Search Marketing Expo – SMX

How to put these findings into action

The final speaker was Herndon Hasty, digital marketing manager for The Container Store. His presentation was more focused on taking the data and findings from previous speakers and providing practical applications. He used case studies to illustrate his own findings, too.

Site speed

Site speed is a longstanding ranking factor, and it’s becoming more important as mobile usage continues to rise. Here are Hasty’s main recommendations for improving site speed:

Caching. Find more elements on your site that you aren’t currently caching or that you should be caching for a longer time period. (Basically, this ensures that the page can load faster for anyone who’s been to your site before.)
Combining your external files. Hosting your JavaScript and CSS in external files can be a great way to reduce page size, but not if you have 30 to 40 external resources that need to be called. Try to consolidate your external files where possible.
Managing your tags. Remove vendor tags you’re not using, and be sure to have the latest versions of the tags you are using.
Image optimization. Hasty believed that image size is the biggest factor impacting site speed. Any time you can shrink an image, it’s going to improve your page speed. Don’t use images that are too big — load large versions only when customers want them. Whether it’s product images, repeated elements or logos, make sure you fit images to their exact space. (See the sketch after this list.)
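
On the image point specifically, one common pattern is to let the browser choose an appropriately sized file rather than always sending the largest one. A minimal sketch, with invented file names:

    <!-- Smaller screens download the smaller files; larger screens get the bigger ones -->
    <img src="widget-800.jpg"
         srcset="widget-400.jpg 400w, widget-800.jpg 800w, widget-1600.jpg 1600w"
         sizes="(max-width: 600px) 100vw, 600px"
         alt="Example widget">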

HTTPS

SEOs have been working on securing their sites ever since Google announced back in 2014 that HTTPS would provide a slight ranking boost.

When switching to HTTPS, you do need to consider site speed, as it will slow down the site a bit — but many still believe the switch is worth it, as there may come a day when Google makes HTTPS a requirement, similar to the mobile-friendly update.

The unfortunate part about switching to HTTPS is that it carries all the risks and challenges of a site redesign but without any of the fun.

Because you’ll need to implement HTTP to HTTPS redirects throughout your entire site, this does at least present a great opportunity to take care of any URL changes that you want to make.

The biggest element that often gets missed in an HTTP to HTTPS migration is canonical tags. Updating your canonical tags is critical, as your site can experience a loss of traffic and site performance due to out-of-date canonicals.
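
For example, a page that has been migrated to HTTPS but still carries its old canonical looks like the first line below; the second is what it should say once the tags are updated (example.com is a placeholder domain):

    <!-- Stale: canonical still references the old HTTP URL -->
    <link rel="canonical" href="http://www.example.com/widgets/">
    <!-- Updated: canonical references the HTTPS URL that the redirects now resolve to -->
    <link rel="canonical" href="https://www.example.com/widgets/">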

Meta data

Somewhat contrary to the findings above by SEMrush and Searchmetrics, Hasty has found that titles can have an impact and do make a difference, especially for lower ranking pages.

Meta descriptions are, of course, not a ranking factor, but they can improve click-through rates. What works in meta descriptions? It’s different for all niches and industries, but Hasty has found that including the following in your meta descriptions tends to increase performance:

Keywords.
Free shipping/returns.
Brand names people know.
“Official site” (a trust signal).
Promotions and sales.
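
A hedged example of a meta description that folds several of these elements together; the copy is invented for illustration, not taken from Hasty’s deck:

    <meta name="description" content="Shop stainless steel water bottles at the official Example Store site. Free shipping and free returns. Save 20% this week.">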

SERP changes

Changes to search engine results pages (SERPs) can really drive down click-through rates — featured answers, more/larger rich snippets, product listing ads and the addition of a fourth text ad to some SERPs have all led to less above-the-fold page real estate for organic results.

These are things you largely can’t control, but you can still adapt your strategies to this changing reality. For example, Hasty recommended seeking “instant answer boxes,” also known as featured snippets. These are showing up for more generic terms and take up 15 to 90 percent of above-the-fold SERP space — plus, digital assistants will read these aloud in response to voice searches.

To obtain a featured snippet, you need to be on the first page, but you don’t even need to be in the top five. Hasty suggests using structured data where possible, too — this will help you capture the correct search intent as Google gets smarter and better at understanding query intent.
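
One generic illustration of what structured data can look like, embedded as JSON-LD; the article details here are invented placeholders, and the right schema.org type depends on your own content (recipes, products, articles and so on):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "How to descale a coffee maker",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2017-10-01"
    }
    </script>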

See Hasty’s full presentation here:

SEO Ranking Factors 2017: What's Important, What's Not By Herndon Hasty from Search Marketing Expo – SMX