How independent reviews influence Google’s trust in your brand

Search Engine Land columnist Kevin Lee recently wrote a post about the prevalence of fake reviews: how they are damaging consumer trust, and why attempting to use them yourself is a bad move with lasting repercussions.

The reason for this growing problem is that online reviews have tremendous influence over the purchasing decisions of consumers, as well as the performance of brands in the search engines. Luckily, many major review sites — including Google, Amazon and Yelp — are taking steps to combat the issue.

With all of this in mind, now is a good time to address how to approach online reviews in an ethical way that will produce long-lasting, positive results for brand perception and search engine traffic.

Google associates trust with ratings and reviews

It’s important to establish the relationship between user reviews and SEO performance before moving forward. Understanding that relationship will inform how to best approach and build a strategy for earning reviews.

A recent study affirmed the strong correlation between ratings and search engine performance for local businesses. The study, conducted by LocalSEO Guide in cooperation with the University of California, Irvine, and PlacesScout, analyzed the correlation between over 200 potential search engine factors and rankings for over 100,000 local businesses.

Specifically, the study found that when a keyword appears in reviews of your business, or when your location is mentioned in a review, your rankings in the search results improve.

Do reviews enhance your performance in general search results, outside of local search?

That is a bit more contentious. Google itself has stated that star ratings in AdWords enhance click-through by up to 17 percent, and a study by BrightLocal has found that organic listings with 4- and 5-star ratings (in the form of rich snippets) enjoy a slightly higher click-through rate than listings with no stars. While there’s never been a formal confirmation, there is a great deal of evidence to suggest that higher click-through rates (CTR) may indirectly enhance your rankings in the search results.

Even if reviews don’t directly impact search rankings, the fact that they enhance click-through rates may potentially affect your rankings in an indirect fashion. And increased CTR is a benefit in itself!

User-generated content and reviews also heavily influence consumer decisions. A study by TurnTo found that user-generated content influenced buyers’ decisions more than any other factor looked at in the study, including search engines themselves.

The fastest way to success

Google has made it easy for you to get your customers to review you, and this is the very first thing you should start with.

Find your PlaceID using the lookup tool that Google has provided here. Put your business name in the “Enter a location” search bar. Click on your business name when it appears, then your PlaceID will pop up underneath your business name.

Copy the PlaceID and paste it to the end of this URL:

For example, the Macy’s location listed above would have the following review URL:

Now, try that URL in your browser with your business’s PlaceID to test whether it works or not. It should take you to a search result for your business with a “Rate and review” pop-up window.
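If you generate these links programmatically, for example for post-purchase emails, the review URL can be assembled from the Place ID. Here is a minimal sketch in Python, assuming the commonly used `search.google.com/local/writereview` endpoint; the Place ID shown is a made-up placeholder, and you should verify the resulting link in a browser, as described above, before sending it to customers:

```python
from urllib.parse import urlencode

def review_url(place_id: str) -> str:
    """Build a 'Rate and review' link from a Google Place ID.

    Assumes the widely used writereview endpoint; test the URL
    in a browser before distributing it.
    """
    base = "https://search.google.com/local/writereview"
    return f"{base}?{urlencode({'placeid': place_id})}"

# Hypothetical Place ID, for illustration only.
print(review_url("ChIJd8BlQ2BZwokRAFUEcm_qrcA"))
```

Because `urlencode` handles escaping, the same helper works for any Place ID you paste in from the lookup tool.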

Share this URL with your customers after transactions to pick up reviews on your Google My Business account.

While the Google My Business reviews are likely to have the largest impact on search engine rankings, they are not the only reviews Google takes into consideration, and it is in your best interest to pick up reviews from third-party sites as well. Third-party review sites can help you pick up more reviews more quickly, and they add diversity to your review profile, which enhances your legitimacy. This, in turn, imbues the reviews with greater authority.

In addition to boosting the authority and diversity of your reviews, third-party review sites help in a few other ways. Many are designed to make it simple to request reviews from your customers in an organized way. (Though be advised that some, like Yelp, discourage review solicitation.)

6 more tactics for picking up reviews

If you want to take things further, listed below are a few more tactics for you to consider working into your review strategy:

1. Identify any industry-specific review sites: Reviews from industry-specific sites (think Avvo for lawyers or ZocDoc for doctors) can be huge, especially if you know that your potential customers are using these sites. It’s important to identify which vertical review sites may be relevant to you and to devise a strategy for earning positive reviews on these sites.

2. Seek reviews from product bloggers: While blogger reviews are an entirely different ballgame from user reviews, they are equally important. Links from trusted bloggers are a strong signal that can positively affect your search engine rankings, and if the bloggers have audiences who trust their opinion, their reviews can earn you referral traffic with conversion rates not achievable from most sources. Just be sure that the blogger discloses any financial arrangement you might have with them.

3. Respond to your reviewers: So long as you handle it tactfully, responding to reviewers (including and perhaps especially negative ones) can have a tremendously positive impact on brand perception, as it shows that you care about your customers. The important thing to remember is that your response is not only for the customer but also for anybody else who sees the interaction. How you treat that reviewer is how they will expect to be treated.

4. Contact your happiest customers: It goes without saying that the happiest customers are the ones most likely to leave a positive review. Tactfully encouraging these customers to leave reviews is an important move if you want people to perceive you in a positive light. (Just be sure that you understand each site’s review solicitation guidelines.)

5. Use social media for customer support: While social media shouldn’t replace a customer support team, many consumers see social media as a place to solve any problem they are having with a product. Many also use social media as a place to complain, often without even trying to contact your business. Be prepared for this, and respond to any mentions of your brand on social media with an offer to help. Don’t make the mistake of taking the conversation offline; keep it online and portray yourself in the best way possible.

6. Ask the right questions: Whatever medium you are using to encourage your customers to leave a review, it’s important to make sure you are asking the right questions. Simply asking them to let people know whether they liked the product typically isn’t the way to go, since it leads to very generic reviews. Ask more specific, pointed questions about how the product helped them solve a particular problem. These are the kinds of stories that encourage people to purchase a product.


Online reviews play an incredibly important part in a buyer’s journey, from interest to purchase. They have a heavy influence on rankings in local search results and play an important part in more traditional search engine performance as well.

Brick-and-mortar businesses should use thank-you emails and other customer communications to point consumers to their Google My Business pages. Take advantage of third-party review sites to easily encourage reviews. Reach out to your customers and online influencers to improve coverage of your products.

Do not neglect these efforts. User reviews influence modern purchasers heavily. If your product is strong, your efforts will pay dividends.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

Law firms spamming Google My Business: Don't trust them!

Last year, I wrote a piece addressed to SEO companies showing how much they were spamming Google Maps and giving the industry a bad reputation. If I worked at Google, this type of stuff would make me hate SEO companies and have no desire to help them.

Lately, I’ve been seeing this same level of spam (or worse) in the legal industry. If you’re an attorney or a marketing agency that works with attorneys, this article is for you.

Personally, if I were looking to hire an attorney and trust my money and my life to someone, the last place I would look is Google, due to my knowledge about how unreliable the information is and how fabricated the reviews are. Let’s get into some specifics.

Fake reviews

Attorneys often complain about how hard it is to get their clients to leave reviews. I get it. Someone rarely wants to publicize who they hired to help them with their divorce or admit that they had to hire a criminal lawyer. This does not, however, excuse what attorneys are doing to get reviews in spite of this.

One common trend amongst attorneys currently is review swapping. Although sites like Avvo might have sections that encourage peer reviews, they do a good job of separating them so that consumers realize they are not reviews from clients.

Google has no such distinction and is very clear in their guidelines that reviews should be about the customer experience. Attorneys you are friends with all around the country do not count as customer reviews. I say this because so far, every review that fits this scenario that I’ve reported to Google has been removed.

In addition to violations of Google’s guidelines, quid pro quo attorney review circles may violate attorney ethics rules. According to Gyi Tsakalakis, a digital marketer with a focus on law firms:

Per the ABA Model Rules, with limited exceptions, lawyers aren’t supposed to give anything of value to a person for recommending the lawyer’s services. The quid pro quo nature of some of these review circles could be construed as a violation of this rule. At the very least, these communications could be interpreted as misleading, which is also prohibited by most states’ rules of professional responsibility.

There also could be legal implications to review swapping. In addition to it being against Google’s guidelines, it could also get you in trouble with the FTC. In an article I wrote on fake reviews earlier this year, Brandon J. Huffman, attorney at Odin Law, mentioned:

The FTC looks at whether you got something of value in exchange for your review. The thing of value is usually cash or a free product of some kind, but the positive review you receive is also something of value. So, this is really no different than a typical paid-for review under the regulations. Businesses would need to disclose that they received a positive review in exchange for their positive review.

Review swaps aren’t the only thing that can get lawyers in trouble with their state Bar Associations. A variety of fake review tactics are likely to lead to sanctions, such as having your employees pose as clients to leave reviews or paying someone to write fake reviews. Indeed, many law firms are just flat-out getting fake reviews posted.

Recently, in looking at the top 20 listings that ranked for personal injury lawyers in a major city in the USA, I found eight that had fake reviews (40 percent).

Fake listings

The most common practice for attorneys who want to rank in several cities is to create listings at virtual offices. When these are reported, Google has been pretty good at removing them. However, attorneys (and their marketing companies) are getting smart at this stuff and have found ways to trick Google My Business support into thinking their fake locations are real locations.

These are also clearly false, or at least misleading, communications about the lawyers’ services — a clear violation of attorney ethics rules.

Fake photos

I have experienced this one many times. An attorney will submit photos on their listing that “prove” they exist there, even though the address belongs to a virtual office service provider. These photos are often:

• photoshopped.
• signs that were taped to a wall, only to be removed after the photo was taken.
• photos of a completely different location.

I actually visited an office recently that an attorney was using for a listing on Google. The photos of the signs that he posted did not exist there in real life. So he was willing to actually show up at the office and tape signs to the wall just to “show” Google that he is really at that location. There is a word we use in my circles to describe this type of thing — and it’s called lying.

As business author Stephen Covey says:

The more people rationalize cheating, the more it becomes a culture of dishonesty. And that can become a vicious, downward cycle. Because suddenly, if everyone else is cheating, you feel a need to cheat, too.

Using other attorneys’ addresses

This is another tactic I’m seeing on the rise in the attorney world. One attorney will get another attorney to accept the postcard from Google My Business so they can get an “address” in that town. Usually, they aren’t competition and practice different types of law, so there isn’t any negative impact on either party. This is also against the guidelines, and when caught, will be removed by Google.

I’m seeing more and more videos being used as evidence on the Google My Business forum to help prove businesses don’t exist at the address they are using. User Garth O’Brien posted another clever idea as a comment on an article by Mockingbird Marketing:

I was aware of a local law firm that did this in Washington. Their competitors called up each city and pointed out that law firm had a physical presence within their city. They inquired if that law firm was paying B&O tax in each city. The law firm was not, so each city called up and asked them to fork over some tax money. That law firm quickly erased each profile for every city [where] they did not have a physical presence.

Keyword stuffing

The final tactic I see being used frequently is keyword stuffing. It’s an old trick that still works well. If you want to rank higher on Google, just shove “Best Attorney Ever City Name” into your business name field in Google My Business.

Good grief…You would think there is a point where keyword stuffing actually hurts the performance of a listing. #StopCrapOnTheMap

— Joy Hawkins (@JoyanneHawkins) November 30, 2016

The problem is that Google will remove the keywords when they catch you. I have also seen them recently suspend a listing for an attorney who wouldn’t stop doing it. Currently, this guy has no ability to edit or control his listing on Google.


If you are sick of the spam you see in the legal industry, please continue to report it on the Google My Business forum. I urge you not to let these people get away with the tactics they are using. Also, no matter how tempting it is — never join them!


Private blog networks: A great way to get your site penalized

You may have heard about private blog networks (PBNs) before, but you may not be sure what they are or why they are used. A PBN is a network of websites used to build links (and therefore pass authority) to a single website for the purpose of manipulating search engine rankings. This scheme is similar to a link wheel or link pyramid, as it involves several different websites all linking to one another or to one central website.

While these types of schemes were used commonly years ago, PBNs are now considered a pure black hat tactic and should be avoided at all costs, as they can lead to a loss in rankings, or even a manual penalty. PBNs usually provide little to no long-term value to the websites they are linking to.

Google has long been fighting PBNs, and businesses caught up in this shady tactic have been made an example of over the years. One such case was the J.C. Penney link scheme that was exposed back in 2011 by The New York Times. As Google gets smarter and develops better technology to combat link spam techniques, it has become harder and harder for black hat SEOs to pull off a PBN successfully.

How to identify private blog networks

The key to identifying a PBN is the cross-site “footprint” where much of the technical data on the sites are the same. Old PBN networks were on the same IP, shared servers, had the same WHOIS information, or even used the same content across sites.

Today, PBNs are much more sophisticated and may be harder for users to spot because the sites span different industries, topics and layouts. When determining if a site is part of a PBN — and therefore one that you should avoid like the plague — consider the following:

• Hosting. Are they all on the same IP? You can use reverse-IP lookup tools to identify which sites are hosted alongside any other site.
• Site design. Do the sites all use a similar design, navigation or color scheme?
• Similar themes. WordPress themes sometimes have the theme name in the code. Check the source code in your browser.
• Site ownership. Check the WHOIS database for the owners’ contact information. Hidden WHOIS data is a red flag, and if all of the site owners are the same, it’s obvious the blogs are connected.
• Duplicate content. Copy a paragraph into Google search to see if the content exists on other sites.
• Backlink profile. Check the backlink profile in Ahrefs or Majestic (these are the largest databases of links) to see how much interlinking is occurring between sites.
• Images and videos. Since videos and images are difficult and expensive to re-create, they are likely to be duplicated across sites. Use Google image or video search to find similar pieces.
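The hosting check in particular can be scripted as a first pass. The sketch below groups domains by the IP address they resolve to; clusters sharing one IP are worth a closer look, though shared hosting alone proves nothing. The domain names are hypothetical, and the resolver is passed in as a parameter so you can stub DNS or swap in a reverse-IP API:

```python
import socket
from collections import defaultdict
from typing import Callable, Dict, Iterable, List

def group_by_ip(domains: Iterable[str],
                resolve: Callable[[str], str] = socket.gethostbyname
                ) -> Dict[str, List[str]]:
    """Map resolved IP -> domains sharing it; unresolvable domains are skipped."""
    clusters: Dict[str, List[str]] = defaultdict(list)
    for domain in domains:
        try:
            clusters[resolve(domain)].append(domain)
        except OSError:
            continue  # DNS lookup failed; ignore this domain
    # Only clusters of two or more domains are interesting as a footprint.
    return {ip: names for ip, names in clusters.items() if len(names) > 1}
```

Treat the output as a signal to investigate alongside design, WHOIS and backlink checks, not as proof of a PBN on its own.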

A dead giveaway for many PBNs is having a similar backlink profile. If multiple sites have the same link profile, or if they all link to one website multiple times (especially where it seems like overkill or it isn’t relevant), then the site is likely part of a PBN — or, at the very least, is selling links. Google’s Penguin algorithm, which now runs in real time as part of the core ranking algorithm, can detect these kinds of schemes and devalue your website rankings as a result. In some cases, you could even wind up with a manual penalty.

However, simply owning several different websites doesn’t mean you are a private blog network. For example, media companies that own several sites and link to them in all website footers wouldn’t likely have to worry about being flagged as a PBN unless the websites weren’t related, there were dozens of links in the footers, or they were linking to similar internal pages repeatedly.

In addition, PBNs are generally groups of sites all owned by one company or individual, but separate individuals who are working together to link to one another could also be considered a PBN if there is a pattern of repeatedly linking to the same sites or pages across several different groups of websites.

How can you protect your site from PBNs?

No reputable SEO consultant will recommend private blog networks for link building or increasing website traffic. Unfortunately, your site may be involved in a PBN without your even knowing it, especially if you are outsourcing your link building activities to a third party. Buying links on sites like Fiverr or through other services may put your site in grave danger. And if anyone tries to convince you to participate in a link exchange (i.e., trade links with them), run.

Strong oversight of link-building activities is key. Educate yourself on which practices Google considers to be link schemes, and ensure that anyone responsible for building links to your site is strictly adhering to these guidelines; any reputable link builder should agree to be transparent about the links they are pursuing for you.

This will require some effort on your part, but remember: Just because you aren’t aware of what goes on behind the curtain doesn’t mean you won’t be held responsible for the consequences.

Best practices will ultimately win the day

You might feel frustrated by competitors who appear to be using spammy link-building techniques like PBNs. You could report them through a webspam complaint, of course. But even if you don’t, remember that their black hat tactics will eventually catch up to them.

While your competitor is relying on a PBN to get links, your company can build out more robust link-building campaigns based on best practices that have more staying power and aren’t frowned upon by search engines. Then, when your competitor gets busted and is demoted, deindexed or otherwise penalized, your site will have the advantage.

As a whole, private blog networks are a dangerous and unacceptable link-building strategy. A link should only be given when it truly provides value to the user — anything to the contrary may result in less visibility within search engine result pages, or even a manual penalty.

Save yourself and your company the headache of lost money, resources and time, and focus on better link-building tactics that will get you results without the strife.


13 outdated SEO tactics that should terrify you

As we approach Halloween and our Netflix queues again fill up with all manner of spooky, startling and downright horrifying monsters, I’m reminded of another kind of monster we should all be afraid of: outdated SEO tactics.

These tactics range from harmless but ineffective (like Casper the Friendly Ghost) all the way to completely devastating (like Freddy Krueger). And much like the bad guy in so many of the horror movies we all grew up watching, these tactics never seem to die, despite common sense, SEO professionals, and even Google warning people away from them.

So today, we’re going to delve into 13 outdated SEO tactics that you should be terrified of and avoid at all costs.

1. Link and article directories

Link directories are generally useless today, with the exception of high-quality, niche-specific directories that follow strict editorial guidelines.

Long before search engines were as powerful and effective as they are today, link directories served as a way to categorize websites so that people could find what they were looking for. Thanks to the simplicity of installing and using the software that powers them, marketers’ insatiable appetite for fast and easy links, and website owners’ hunt for additional revenue streams, link directories exploded in popularity.

But, since they didn’t provide any real value to visitors, search engines began to ignore many of these link directories — and they quickly lost their effectiveness as a link-building tactic. Eventually, link directories became a toxic wasteland of low-quality links that could actually get your website penalized.

Article directories are even worse. What started as a way to share your insight with a larger audience while earning links was quickly abused: marketers began using software to “rewrite” their articles and submit them to thousands of article directories at a time.

As with link directories, article directories — now bloated with low-quality content — simply hit a point at which they provided no value to visitors. Marketers just used them for fast and easy links. Indeed, the glut of low-quality content flooding the web through these article directories appeared to be the proverbial straw that broke the camel’s back right before the release of Google’s Panda update, which decimated countless websites.

With the exception of high-quality, niche-specific link directories — and you may only find one or two in any given industry — you should avoid link and article directories entirely.

2. Exact-match domains

For a while, exact-match domains (EMDs) were a hot topic because they became a silver bullet for search engine optimization. It was easy to throw up a microsite on an exact-match domain and rank far more quickly than a traditional, branded domain — often in weeks, sometimes in days.

With an EMD, your domain matches the exact keyword phrase you’re targeting. For example, a business targeting “Tampa contractor” might register a domain like tampacontractor.com.

But much like a werewolf when the full moon wanes, EMDs quickly lost their power as Google adjusted their algorithm.

Exact-match domains have the potential to rank as well as any other domain, but they also seem to have a higher potential to be flagged for spam, either algorithmically or manually. They become an even riskier proposition when you consider that they generally aren’t as “brandable,” and as a result, the domain will generally be viewed as less trustworthy, which can reduce conversions and make link building more difficult.

3. Reciprocal linking

Search engines view a link to another site as a “vote” for that site — so reciprocal linking is essentially saying, “If you vote for me, I’ll vote for you.” This is the very definition of manipulative linking practices, yet that didn’t stop millions of marketers from blindly trading links, even with websites that had zero relevance to theirs.

Worse yet, rather than links embedded within valuable content, these links were often simply dumped on a “links” or “resources” page, sometimes broken into categorical pages, along with hundreds of other links, offering no value to visitors.

This tactic, though ineffective today, still stumbles slowly along like a putrid and rotting zombie, more than a decade after its death.

4. Flat URL architecture

This isn’t really a “tactic” as much as it is just the default way WordPress is set up, and most people don’t know that they need to change it.

Ex. 1: example.com/sample-post

Ex. 2: example.com/category-name/sample-post

A flat URL structure (Ex. 1) makes it more difficult for search engines to understand the hierarchy of your website because all of your pages are treated with the same level of importance, while a faceted or nested URL structure (Ex. 2) clearly communicates the importance of each page within your website in relation to every other page within your website.

The first step is to change your default permalink settings. Then, if you haven’t already, publish your second-level pages, and create corresponding blog categories; or, if they already exist, move them and set up any applicable redirects.

The slugs for your categories must exactly match the slugs for your second-level pages. This little detail is critical because it determines how search engines will value each page within your website relative to other pages within your website.
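Before flipping your permalink settings, it’s worth checking that every blog category slug really has a matching second-level page slug. Here is a quick sketch with hypothetical slug lists; in practice you would pull both lists from your CMS:

```python
from typing import Iterable, List

def slug_mismatches(page_slugs: Iterable[str],
                    category_slugs: Iterable[str]) -> List[str]:
    """Return category slugs that have no matching second-level page slug."""
    return sorted(set(category_slugs) - set(page_slugs))

pages = {"services", "blog", "about"}        # hypothetical page slugs
categories = {"services", "case-studies"}    # hypothetical blog categories
print(slug_mismatches(pages, categories))    # → ['case-studies']
```

Any slug the function reports needs either a new second-level page or a renamed category before the nested structure will line up.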

Once properly configured, each third-level page and blog post will appear as a sub-page of the applicable second-level page based on the blog category it is assigned to. In other words, each third-level page/post adds more authority to the page it appears nested under.

It’s important to think this through thoroughly because changing it later means having to redirect all of the pages in your website and potentially losing ranking.

5. Indiscriminate guest blogging

Contrary to what some people claim, guest blogging is far from dead. However, it has changed dramatically. To fully understand the context, it’s important to understand the evolution of guest blogging over the years.

Guest blogging has roots in traditional public relations. The basic premise is that you’re trying to leverage a larger, existing audience by publishing your article on an established publication. This helps you to:

• create more authority, credibility and trust.
• demonstrate your personal brand.

In the early days, you would seek out publications for guest posting opportunities based on the size and, more importantly, the relevance of their audience. The intent was to get in front of more of the right people, and this involved writing killer content that their audience would find valuable, which would usually include a short bio, and maybe even one or more links back to your own website.

Website owners attempting to keep Google happy by constantly adding fresh content were all too eager to publish a steady stream of posts from guest authors, and because links are the lifeblood of SEO, people quickly latched onto this tactic to build links and sucked the life out of it like a ravenous vampire.

Marketers soon began submitting guest posts to any website that would accept them in an attempt to acquire a link.

Your website is about construction? Great! Let me submit an article on construction trends, along with a bio that includes a link back to my crochet website — relevance be damned! The next predictable step was that many marketers began submitting completely off-topic articles, and website owners eagerly published them.

This is why we can’t have nice things.

Google understandably showed up like a mob of angry villagers with pitchforks and torches to put an end to this nonsense and, as they often do, created a lot of collateral damage in the process. Websites were penalized, and while some took years to recover, many never did, so their owners had to start over on a new domain. A lot even went out of business.

For a while, people shied away from guest blogging, but today, it’s returned to its traditional roots.

6. Keyword stuffing

Back when search engines were only capable of interpreting simple signals, like keyword density, stuffing keywords by the truckload into a web page to make it seem more relevant was all the rage. What should have been just a few instances of a particular phrase sprinkled throughout a web page grew faster than a zombie outbreak.

This doesn’t work — and more importantly, it makes it look like you employ drunk toddlers to write your copy, which doesn’t do much to inspire trust in your company.

7. Exact-match anchor text

At one point, anchor text — the clickable text of a link — was a huge ranking factor. For example, if you wanted to rank for “Tampa contractor,” you would have tried to acquire as many links using Tampa contractor as the anchor text as you could.

Marketers predictably abused this tactic (seeing a trend yet?), and Google clamped down on it and dropped the ranking for websites with what they deemed to be unnatural amounts of keyword-rich anchor text backlinks.

The anchor text distribution for a natural link profile will generally have a lot of variety. That’s because if 100 different people linked to the same page on your website, each link would likely be used in a different context within their content. One person might link to your web page using anchor text that describes the product (“blue widgets,” for example), while another may link using anchor text that describes the price, and yet another might even link using nondescript anchor text like “click here” or something similar.
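You can tabulate this variety for your own site from a backlink export. The sketch below uses made-up anchors; in practice you would load the anchor column from an Ahrefs or Majestic CSV:

```python
from collections import Counter
from typing import Dict, Iterable

def anchor_distribution(anchors: Iterable[str]) -> Dict[str, float]:
    """Return each normalized anchor's share of all backlinks, as a percentage."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {a: round(100 * n / total, 1) for a, n in counts.most_common()}

# Hypothetical anchors pulled from a backlink export.
sample = ["Acme Co", "acme co", "click here", "blue widgets", "acme co"]
print(anchor_distribution(sample))
```

A healthy profile typically shows branded and nondescript anchors dominating, with exact-match phrases as a small minority; the reverse pattern is the red flag described above.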

Below is an example of the anchor text distribution for Search Engine Land.

The majority of your anchor text will not be an exact match to the keyword topics you’re targeting unless they are part of your brand or domain name. And this is OK because today, rather than anchor text, Google places more emphasis on:

• the relevance of the linking website to your website.
• the authority of the linking website.
• the number of relevant links from authoritative websites pointing to your website.

I wouldn’t put too much effort into controlling the specific anchor text that others use to link to your website — it’s a waste of time, and it can potentially harm your ranking if you go overboard and create an unnatural pattern. The majority of anchor text for most websites with a natural link profile will generally be for branded terms anyway.

8. Pages for every keyword variation

Keyword phrases, in the traditional thinking, are dead. The old approach involved creating a separate page for every keyword variation, but fortunately, search engines are a lot smarter today, so this isn’t necessary.

Google’s Knowledge Graph, with its understanding of entities and topics rather than exact keyword strings, started to kill off traditional thinking, but RankBrain drove a stake into its heart. Today, websites that still follow this antiquated tactic perform a lot like the zombie hordes you see mindlessly wandering around in a George Romero movie in search of fresh brains to devour.

RankBrain is just a catchy name for Google’s machine-learning artificial intelligence system (Skynet was already taken, apparently) that helps it to better understand the user intent behind a query. It can even help Google to (appropriately) rank a web page for keyword phrases that aren’t in the content!

This means that if you write content for a page about HVAC services, RankBrain understands that it would also very likely be a good match for a user entering any of the following queries:

HVAC repair.

HVAC maintenance.

HVAC tune-up.

HVAC cleaning.

If you’ve created individual pages for each keyword variation in the past, you may be tempted to leave them and just stop doing it in the future, but that’s not enough. You need to prune the unnecessary pages, merge content that can be merged, and create any applicable 301 redirects, because these unnecessary pages will have a negative impact on how Google views your website, and how often and how thoroughly it is crawled.

So, instead of creating an individual page for every keyword phrase you want to rank for, create a more comprehensive page for a keyword topic. Using the HVAC example we mentioned earlier, this would involve creating a page about HVAC services, along with a subheading and content for each of the additional highly-related phrases.
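The pruning and merging step above boils down to a redirect map: every old keyword-variation URL should return a 301 pointing at the consolidated topic page. Here's a minimal sketch using hypothetical HVAC URLs (how you wire this into your server — .htaccess, nginx, CMS plugin — will vary):

```python
# Hypothetical URL consolidation: each old keyword-variation page
# 301-redirects to the single comprehensive topic page.
REDIRECTS = {
    "/hvac-repair": "/hvac-services",
    "/hvac-maintenance": "/hvac-services",
    "/hvac-tune-up": "/hvac-services",
    "/hvac-cleaning": "/hvac-services",
}

def resolve(path):
    """Return (status, path): a 301 to the merged page, or a 200 as-is."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/hvac-repair"))    # (301, '/hvac-services')
print(resolve("/hvac-services"))  # (200, '/hvac-services')
```

The 301 (permanent) status matters: it tells Google the old page has moved for good, so any link equity it earned can be consolidated into the surviving page.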

9. Paid links

Paying for PageRank-passing links has been a clear violation of Google’s webmaster guidelines for a long time, but like the machete-wielding villain of Camp Crystal Lake, this one simply refuses to die.

I take a pragmatic view of buying links: They can work to improve your ranking in the short term, but you may eventually get caught and penalized, so is it really even worth it?

You might think you can be really careful — buy just a few links to get some traction and stay under Google’s radar — but that’s not going to happen. Google is always hunting for both link buyers and link sellers, and catching them is shockingly easy, because all it has to do is follow the links.

You might be thinking, “Pffft… I know what I’m doing, Jeremy! I’m careful when I buy links!” Sure you are. But can you say the same thing about the site owners you buy the links from? Or everyone else who buys links from them?

Let’s say Google catches one link buyer by identifying an unnatural pattern of inbound links — all they need to do next is evaluate the outbound links of anyone linking to that buyer to identify more link sellers. In turn, that will uncover more link buyers, which again uncovers more link sellers.

See how fast it all goes downhill? So just don’t buy links.
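The downhill cascade described above is just a graph traversal: from one caught buyer, follow links backward to find sellers, then forward again to find their other customers. Here's a toy sketch with made-up site names, purely to illustrate why one catch unravels the whole network:

```python
from collections import deque

# Toy link graph; all site names are illustrative.
links_to = {          # who each seller links out to
    "seller_a": ["buyer_1", "buyer_2"],
    "seller_b": ["buyer_2", "buyer_3"],
}
bought_from = {       # who each buyer received links from
    "buyer_1": ["seller_a"],
    "buyer_2": ["seller_a", "seller_b"],
    "buyer_3": ["seller_b"],
}

def uncover_network(first_buyer):
    """Starting from one caught buyer, alternately expand to the
    sellers linking to known buyers and the other buyers those
    sellers link to, until nothing new turns up."""
    buyers, sellers = {first_buyer}, set()
    queue = deque([first_buyer])
    while queue:
        buyer = queue.popleft()
        for seller in bought_from.get(buyer, []):
            if seller not in sellers:
                sellers.add(seller)
                for other in links_to.get(seller, []):
                    if other not in buyers:
                        buyers.add(other)
                        queue.append(other)
    return buyers, sellers

buyers, sellers = uncover_network("buyer_1")
# One caught buyer exposes both sellers and all three buyers.
```

A single unnatural inbound pattern is enough to seed the traversal, which is exactly why "staying under the radar" isn't a plan.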

10. Low-quality content

I recently gave a presentation on digital marketing to a group of franchisees of a large national brand. While discussing the type of content they should be producing for their websites, one of the franchisees said in frustration, “I can’t write articles for my website — it takes too much time and effort just to do what I’m doing now!”

Effective SEO requires you to regularly produce amazing content — which is, understandably, difficult for time-strapped marketers. A lack of time and resources can often lead to rushing content creation, or worse yet, outsourcing it to non-English speakers or budget services like Fiverr or Upwork. The resulting content is often the text equivalent of the unintelligible grunts from Frankenstein’s monster.

The days of producing content just for the sake of publishing something are, fortunately, far behind us, thanks to Google’s Panda update in 2011. Since then, the algorithm has been further refined and worked into the core algorithm.

Your content should be robust, well-written, accurate and engaging. There is no minimum or maximum ideal length; it just needs to be long enough to serve its purpose. Sometimes that may mean just a few hundred words, and other times, that may mean several thousand words.

While we’re on the subject of writing content…

11. Writing for bots rather than people

If you’ve ever seen a web page or an article that repeats a particular keyword over and over, awkwardly forces a keyword phrase into a sentence in a way that doesn’t make sense or incorporates unnecessary heading tags, then you’ve probably seen an example of someone writing for bots rather than people.

SEO has come a long way since the early days, when we had to really spell everything out in order for the search engines to understand and rank a page. You don’t need to do that anymore. Write for people, because they will be the ones buying your products or services.

12. Creating multiple interlinked websites

There are two approaches to creating multiple interlinked websites — and neither one is an effective SEO tactic today.

The first approach is interlinking several legitimate websites that you own. This is the lesser of two evils because if done properly, it won’t result in a penalty. However, it also won’t have much impact, if any, on your SEO efforts, since search engines place a high value on the number of linking root domains, not just the total number of links. Another black mark against this approach is that it reduces the resources you can direct to marketing your primary website.

An example of this being done properly would be when a residential home builder links to a mortgage company that they also own, because there is a high relevance between both websites.

The second approach, which is unquestionably black hat, is to create a series of websites just for the purpose of linking to other websites you own. Since this tactic requires you to create an ever-growing network of websites on such a scale that the only way to describe it would be a gremlin pool party, it’s an absolute certainty that you will also create a pattern that Google can identify, which will result in a penalty.

Instead of trying to build, manage and market multiple websites just to acquire a few measly links, focus your efforts on earning lots of high-quality links from other legitimate websites. An added benefit is that as those websites become more authoritative, their links to your website will become more powerful.

13. Automated link building

When links became an essential part of SEO, marketers predictably sought ways to maximize their link-building efforts using a variety of automated software programs. They blasted their links into guestbooks, blog comments and forums; submitted their websites to bookmarking services and link directories; and spun poorly written articles by the thousands for submission to every article directory they could find.

I’m all for automating certain tasks to improve efficiency within your business, but link building is not one of them, because the only kinds of links that can be built this way violate Google’s webmaster guidelines.

You can call me a purist, but there is simply no way to automate high-quality link building. That requires creating amazing content and developing relationships to earn links to it. There are no shortcuts.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

Google News spammed with drug spam, dating sites & more

If you go to Google News and scroll down to the health section or click on the health section to load more health news, you may be presented with a ton of spam. It looks like Google News was injected with a ton of hacked content for pharmaceutical spam, as well as unrelated dating site spam.

Here is a screen shot of some of the articles showing up for me in the Google News health section right now:

Earlier today, this content appeared directly on the main Google News home page, but now it is only when you try to load more health news.

It is unclear if Google is aware of the spam issue and is actively cleaning it up at this point.

It is also unclear how exactly this spam entered Google News. It may be that these already-approved Google News sites were hacked. If so, the publishers need to clean up the hack quickly, or Google will drop them from Google News and the search index until their sites are cleaned up.

Hacked content is a growing issue that Google is aware of.

We have emailed Google for more details on this spam attack.

Postscript: Google has confirmed the issue was on the publisher side and it appears Google has cleaned up the issue in Google News.

Google warns against misusing links in syndication & large-scale article campaigns

Google’s out today with a warning for anyone who is distributing or publishing content through syndication or other large-scale means: Watch your links.

Google’s post reminds those who produce content published in multiple places that, without care, they could be violating Google’s rules against link schemes.

No content marketing primarily for links, warns Google

Google says that it is not against article distribution in general. But if such distribution is done primarily to gain links, then there’s a problem. From the post:

Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google’s guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site …

For websites creating articles made for links, Google takes action on this behavior because it’s bad for the Web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users.

Those pushing such content want links because links — especially from reputable publishers — are one of the top ways that content can rank better on Google.

Warning signs

What are things that may tip Google into viewing a content distribution campaign as perhaps violating its guidelines? Again, from the post:

Stuffing keyword-rich links to your site in your articles

Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites

Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on

Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site

Staying safe

There are two safe ways for those distributing content to stay out of trouble: using nofollow on specific links or the canonical tag on the page itself.

Nofollow prevents individual links from passing along ranking credit. Canonical effectively tells Google not to let any of the links on the page pass credit.
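For anyone auditing syndicated copies at scale, the nofollow check above can be automated. Here's a minimal sketch using Python's standard-library HTML parser to flag links that would pass ranking credit (the sample markup and URLs are illustrative):

```python
from html.parser import HTMLParser

class FollowedLinkFinder(HTMLParser):
    """Collect the hrefs of <a> tags that lack rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = attrs.get("rel") or ""
        if "nofollow" not in rel.split():
            self.followed.append(attrs.get("href"))

html = """
<p>Read <a href="https://example.com/widgets">blue widgets</a> and
<a rel="nofollow" href="https://example.com/deal">this deal</a>.</p>
"""
finder = FollowedLinkFinder()
finder.feed(html)
print(finder.followed)  # ['https://example.com/widgets']
```

Any URL that lands in `followed` is a link that still passes credit; in a syndicated article, each one is a candidate for nofollow (or the whole page for a canonical back to the original).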

Publishers can be at risk, too

It’s important to note that Google’s warning isn’t just for those distributing content. Those publishing it can face issues with Google if they haven’t taken proper care. From Google’s post:

When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking.

Sites accepting and publishing such articles should carefully vet them, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel=”nofollow” on them?

In other words, publishing content unquestioningly, in terms of links, could expose the publisher’s site to being penalized in Google.

Why this new warning?

Today’s warning from Google is generally the same as what it issued back in July 2013, when it cautioned about links in large-scale guest posting, advertorials, sponsored content and press releases. However, it’s more specific in terms of syndication and comes because of an issue that Search Engine Land has been investigating over the past month.

Search Engine Land has a policy of generally not writing about cases of spam or suspected spam that aren’t already public in a significant way. Our open letter from 2014 explains this more. In short, if we did this, that’s all we would ever be writing about.

That said, we received a tip about several businesses using article syndication that seemed worth taking a closer look at, given that the tactics potentially violated Google’s guidelines in a significant manner. Moreover, Google had been notified of the issue at the end of last year, twice, but had not apparently taken any action. The company tipping us — a competitor with those businesses — was concerned. Was this tactic acceptable or not?

The many examples I looked at certainly raised concerns. Articles were distributed across multiple news publications. The articles often contained several links that were “anchor rich,” meaning they appeared to have words within the links that someone hoped they might rank well for. Mechanisms for blocking these links from passing credit were not being used.

Google’s initial response to our questions about this was that it was aware there were issues and that it was looking to see how it might improve things.

That seemed a weak response to me. It was pretty clear from my conversations with two of the companies distributing the content, and one of the publishers, that there was, at the very least, confusion about what was acceptable and responsibilities all around.

Confusion about what’s allowed

Both the companies producing content professed that they felt they were doing nothing wrong. In particular, they never demanded that publishers carry any particular links, which seemed to them to put them on the right side of the guidelines. One also said that it was using canonical to block link credit but that the publishers themselves might be failing to implement that correctly. Both indicated that if they weren’t doing things correctly, they wanted to change to be in compliance.

In short: it’s not us to blame, it’s those publishers. And from the content I looked at on publisher sites, it was pretty clear that none of them seemed to be doing any policing of links. That was reinforced after I talked with one publisher, which told me that while it did make use of nofollow, it was reviewing things to be more “aggressive” about it now. My impression was that if nofollow was supposed to be used, no one had really been paying attention to that — nor was I seeing it in use.

In the end, I suggested to Google that the best way forward here might be for them to post fresh guidance on the topic. That way, Search Engine Land wasn’t being dragged into a potential spam reporting situation. More important, everyone across the web was getting an effective “reset” and reeducation on what’s allowed in this area.

Getting your house in order

Now that such a post has been done, companies distributing such content and publishers carrying it would be smart to follow the advice in it. When Google issues such advice, as it did about guest blogging in January 2014, that’s often followed by the search engine taking action against violators a few months later.

From a distributor point of view, I’d recommend thinking strongly about how Google ended today’s blog post:

If a link is a form of endorsement, and you’re the one creating most of the endorsements for your own site, is this putting forth the best impression of your site? Our best advice in relation to link building is to focus on improving your site’s content and everything — including links — will follow (no pun intended).

Bottom line: Deep down, you know if you were putting out this content primarily to gain links. If that was the case, you should work with those publishers to implement nofollow or canonical. If you can’t, then you should consider disavowing links to Google.

Going forward, I’d look to implement nofollow or canonical as Google recommends, if you find that the large-scale distribution is bringing you useful direct clicks and attention.

I will say that no one should take this to mean that you can never distribute content or that content can’t have any links at all that pass credit back to an originating site. Indeed, we have plenty of contributed content here on Search Engine Land. I’d be among the first screaming at Google if I thought it was trying to tell us or anyone that you couldn’t have such content unless you blocked all links.

Things that make us feel Google-safe are that, most of all, we publish original content from contributors. It’s not the same content that’s simply dumped into multiple publications. Also, we have editors who often spend a significant amount of time working with writers and content to ensure that it’s publication-worthy. And we do try to watch for links that we don’t feel are earned or necessary in a story.

We’re not perfect. No publisher will be. But I think from a publisher perspective, the more you are actually interacting with the content you publish to review and approve it, rather than blindly posting from a feed, the safer you will be. If you haven’t been doing that, then consider making use of nofollow and canonical on already-published content, as Google recommended.

As for those guest blogging requests

I’ll conclude with this part of Google’s post today:

Webmasters generally prefer not to receive aggressive or repeated “Post my article!” requests, and we encourage such cases to be reported to our spam report form.

Indeed. It’s amazing how many requests like this we’re getting each day, and I know we’re not alone. It’s even more amazing when this type of guest blogging was supposed to be over.

“Stick A Fork In It, Guest Blogging Is Done,” declared Matt Cutts in January 2014. Cutts, no longer at Google, was then the head of its web spam fighting team. His declaration was a shot heard around the web. Guest blogging almost became radioactive. No one seemed to want to touch it, much less send out idiotic bulk emails requesting a post.

Those requests are back in force. It’s a pity that so many come from Google’s own Gmail system, where all Google’s vaunted machine learning doesn’t catch them as the spam they are.

If you’ve been making such requests or accepting guest blog posts because of them, even in small scale, Google’s rules about policing links still apply.

Google: We sent 9M web spam messages in 2016, more than double 2015 total

Google released its annual web spam report yesterday, documenting some of the company’s spam-fighting activity in 2016.

One of the most eye-catching data points for me was that Google sent out over 9 million messages related to web spam in 2016; that number is more than double the 4.3 million messages in the 2015 report. The other metric that stood out was that hacked sites continue to rise — this time by 32 percent from 2015 to 2016, compared with the 180 percent increase from 2014 to 2015 in the previous report.

Here are some data highlights from this year’s report:

32 percent increase in hacked sites compared to 2015

Over 9 million messages sent to webmasters to notify them of web spam issues on their sites

Structured data manual actions taken on more than 10,000 sites

Over 180,000 user-submitted spam reports from around the world, down from 400,000 the year before

Of those 180,000 spam reports, 52 percent of the reported sites were considered to be spam

More than 170 Google online office hours and live events around the world, reaching audiences totaling over 150,000 website owners, webmasters and digital marketers

More than 67,000 questions in the Google support forums

Google also noted in this report that they made the Penguin algorithm real-time in 2016.

Is Google really keeping fake listings off Google Maps?

Google announced last week that they had recently conducted a study to research the actors behind fake listings on Google Maps.

The study points out that “geographic proximity is the coin of the emerging localized-search realm,” which matches what I observed with the Possum update in 2016. This major algorithm update was responsible for making proximity to the searcher the #1 ranking factor for local search. This drives up the incentive for companies to create fake listings, since they need more “locations” in order to monopolize the search results and maximize exposure.

There were some things I saw in the study that were very helpful to know:

Most of the listings that got suspended were in India and the United States (74 percent of all the listings observed in this study).

40.3 percent of the listings were in industries that are on-call, like plumbers, locksmiths and electricians.

54 percent of the suspensions in the United States were in the following six states: California, New York, Florida, Texas, Illinois and New Jersey, while these states only account for 39.9 percent of the population.

The study also looked at listings that were suspended due to marketing companies abusing the verification process and obtaining ownership of the listing without the consent of the business owner. This happens a lot in the restaurant industry, where marketing companies will create a copy of the business website and put a booking/ordering link on it, then charge the business owner for all orders placed via their system. They then take ownership of the listing via Google My Business, and the business owner is stuck paying for transactions from any customer finding their local listing on Google. To make matters worse, they refuse to transfer ownership of the listing to the restaurant owner when he finally figures all this out.

When I first read the statement, “Our study shows that fewer than 0.5% of local searches lead to fake listings,” I actually laughed. But then I read the study and began to understand why they came to that conclusion: The study only looked at listings that were suspended.

In collaboration with Google, we examine over a hundred-thousand business listings that appeared on Google Maps between June 1, 2014 and September 30, 2015 and were subsequently suspended for abuse.

This didn’t look at any of the listings the algorithm failed to catch, so that statistic is in no way an accurate representation of all searches done on Google. They further state:

In our study, we treat any listing that was in a suspended state at any time as abusive, while we treat all other currently active listings as legitimate.

If Google’s algorithms were perfect and caught every single fake listing, this would be a good way to approach this. Unfortunately, nothing could be further from the truth.

The study also states that they found 25.7 percent of abusive listings were in the locksmith category. This makes sense, since the algorithmic triggers for a suspension for locksmiths are not the same as for other industries. Google has been much harsher on locksmiths in the last year, and we see way more suspensions for them on the Google My Business forum. This is really more of an indicator that Google is penalizing locksmiths more than other industries, not necessarily that more spam exists in that industry.

The announcement from Google highlights that they’ve detected and disabled 85 percent of fake listings before they even appear on Google Maps. That statement is extremely misleading if you don’t realize they’re only referring to listings that Google’s algorithm actually caught — in those cases, only 15 percent of the listings were live on Google Maps before getting suspended. It does not mean that Google actually catches 85 percent of fake listings.

I agree that Google’s algorithms are getting smarter and better at catching spam. Nevertheless, here are the major issues with Maps spam that this study didn’t address.

1. Google My Business support issues

The Google My Business phone support team seemingly lacks the ability to understand what spam is and how to detect it. I had a recent case of an attorney that I brought up to Google. This guy has over 20 fake listings at addresses that he rents via virtual office providers. They are all listed as open during regular business hours, and many of them aren’t within driving distance of one another. (In other words, you’d have to hop on a plane to get from one location to another.)

Here’s the kicker: there are exactly two lawyers working at this firm. Please tell me how two attorneys are able to operate over 20 locations at the same time. This seems like common sense to me, but apparently not to Google, which decided to keep them all active.

Advanced Verification will never solve the problems the algorithm doesn’t catch if the operators looking at the listings don’t know how to tell what’s real from what’s not. It has only launched in a few areas of California, and yet here are three examples from the last month that somehow passed Advanced Verification and shouldn’t have:

A plumber in West Hollywood using a mailing service as his address

A plumber in Los Angeles using the address of a mailbox store

A locksmith in San Diego using a UPS store’s address

You would think that Google could see this pretty easily, since the mailing services are using the exact same address on their listing on Google Maps.

2. Virtual offices

The number of fake listings I see on a regular basis using virtual offices is enormous, and these listings seem to go completely undetected by Google. I’m not sure I’ve ever witnessed a single case where listings got automatically flagged for using a virtual office service. Even when they are reported, these types of listings often get reinstated by the Google My Business support team.

3. Multiple listings that are set up using employees’ home addresses

This is a common pattern I’ve seen on one business that has multiple listings. I had to argue with Google for months last year to get them to remove a listing for a marketing company in Toronto that set up multiple listings in multiple cities using the home addresses of their employees. They kept stating that the listings were legit, even after I called the business and asked for directions to their fake location (the girl on the phone was very confused) to confirm they didn’t exist there.

4. Keyword stuffing in business names

This has been a spamming strategy that has worked well ever since Google Places first came into existence. There currently is no penalty in place for businesses that decide to add extra words to their business name to help with their ranking. Instead, they reap a ranking benefit until someone reports them.

When reported, Google might actually remove the keywords from the title, but then the business owner can just go add them back via the Google My Business dashboard.

5. Review spam

The number of fake reviews I’ve seen reported on the forums seems to be on the rise. It’s not just business owners who pay for fake positive reviews, but also people who leave negative reviews for their competitors, like the case that Casey Meraz looked at here.

6. Photo spam

Spammers will upload tons of photos highlighting their services on the listings of their competitors. Tim Capper lists some examples here.

Final thoughts

I was happy to see Google release information that shows they are aware of some of the issues around Google Maps spam and are doing things to help fix it. However, until they decide to invest more in training and educating their support staff on spam, it seems that the users who attempt to report the spam that the algorithms missed will continue to feel like they are talking to a brick wall.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

Why Google shutting down Map Maker should terrify SMBs

Google’s Map Maker has often received bad press due to the amount of spam that originates from users of the product. In May of 2015, Map Maker editing was actually shut down temporarily to help prevent disasters like this one. So Google’s announcement that they’re shutting down Map Maker entirely in March of 2017 made a lot of people really happy.

It’s the end of spam, right? We should all be celebrating, right?

Nothing could be further from the truth.

Allowing spammers to hide rather than vanquishing them

The truth is that users — some with less-than-virtuous motives — will still be able to make edits to business listings, just like before. Instead of doing it on Map Maker, however, they will do it through Google Maps by pressing “suggest an edit” on the listing.

Spammers know this, of course, and have already shifted the majority of their edits to Google Maps because it hides their activity from the general public.

Currently, when a user makes an edit to a listing on Map Maker, the edit history on the listing shows their username, and you can see what other edits they’ve made.

This feature of Map Maker makes it much easier for power Maps users and marketing companies to chase down spammers and report them to Google (there is an option to report the user, as well).

With Google Maps, however, the users are completely anonymous, and there is no way for anyone, other than Google, to see what types of edits they have made in the past and who else they may have sabotaged.

Why this is a serious problem

Here are a few recent examples.

1. In case you didn’t hear, Maps users recently were able to successfully change a bunch of listings for Trump hotels and cafes to list them as “Dump Tower” and “Dump Cafe.”

At one point, the category for the hotel was also changed from luxury hotel to a garbage collection service. The edits were made by users through Google Maps, and the only way to see them was while they were pending on the Google Maps app.

I took several screen shots after I saw the article on Twitter, and as you can see, the edits show absolutely no details about the user who made them.

There is no way for the person looking at the edit to tell what else that user is doing or what their intentions might be.

2. The insurance industry is one that gets a lot of spam edits. Because these edits are now going through Google Maps, the mapping community is completely blind to them, and apparently, Google isn’t taking much note of them, either.

Back in mid-November, someone managed to update over 59 listings for various insurance companies, changing the listed phone number to this one: 800-701-5909. When called, this phone number goes to a lead provider, which captures the user’s information and then turns around and sells that lead to an insurance office.

Here is a screen shot showing some of the various listings that still had that phone number on them three weeks later (December 8).

The business owners were not alerted to this change, so most of them are completely unaware that all their customers and potential customers are now calling someone else. Who made this change? There is no way for anyone (other than Google) to know. Google Maps shows no edit history log whatsoever, and Map Maker classifies this user as “Google” (edits made via Google Maps on desktop show up this way in Map Maker).

Although I can tell the business owner to change the information back via the Google My Business dashboard, there is nothing stopping it from happening again, and unless the business wants to check their information daily, there is currently no way that they will know about changes.

I have already contacted a few of these businesses to make them aware, but the problem is still that the Maps user is out there unpunished and most likely making more edits, since there is no way for businesses to identify him or her and alert Google about the username.
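Since there is currently no alert when a listing is edited, the only practical defense is the daily check mentioned above. Here's a minimal sketch of what that monitoring amounts to: snapshot the listing's public details each day (however you obtain them — the data below is illustrative) and diff today's copy against yesterday's:

```python
# A minimal monitoring sketch: compare today's snapshot of a listing's
# public details against yesterday's and report any fields that changed.
# All listing data below is hypothetical.
def diff_listing(previous, current):
    """Return {field: (old_value, new_value)} for every changed field."""
    return {
        field: (previous[field], current.get(field))
        for field in previous
        if previous[field] != current.get(field)
    }

yesterday = {"name": "Acme Insurance", "phone": "555-0100"}
today = {"name": "Acme Insurance", "phone": "800-701-5909"}

changes = diff_listing(yesterday, today)
print(changes)  # {'phone': ('555-0100', '800-701-5909')}
```

Any non-empty diff is a cue to restore the correct details via the Google My Business dashboard before too many calls go to the wrong number.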

Long-time spammers have shifted their efforts from creating dozens of fake listings (which is becoming harder to do) to reporting legitimate businesses as spam and having them removed.

Here is a recent example of a legitimate locksmith in Florida who had his listing removed from Google because a user (most likely a competitor) reported him as spam. Neither the locksmith nor I have any clue who did it or what will prevent them from doing it again. His listing was down for about a week while he was waiting for Google to reinstate it.

So, while I wish I could look forward to the prospect that spam on Google will diminish with the removal of Map Maker, I’m actually seeing the exact opposite occur.

The number of threads on the Google My Business forum related to legitimate listings being changed or taken down is on the rise and will most likely continue to increase until there is more transparency. After all, what would spammers like better than to be able to hide all their activity? My current advice to SMBs is to keep a close eye on their Google listing to make sure no one is trying to sabotage it.
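Keeping an eye on a listing can be partly automated with a daily diff of your listing data. The sketch below is a hypothetical illustration, not from the article: the field names and values are made up, and fetching the current values (for example, via a scheduled job that queries the Google Places API for your place ID) is left out.

```python
def detect_listing_changes(baseline: dict, current: dict) -> dict:
    """Compare a stored baseline of listing fields (name, phone,
    website, address) against freshly fetched values and return
    every field that differs, as {field: (old_value, new_value)}.

    How `current` is fetched is up to you -- e.g. a daily job that
    pulls your listing data from the Google Places API.
    """
    return {
        field: (old, current.get(field))
        for field, old in baseline.items()
        if current.get(field) != old
    }

# Example: a phone-number hijack like the one described above
# (business name and numbers are illustrative)
baseline = {"name": "Acme Insurance", "phone": "555-0100"}
current = {"name": "Acme Insurance", "phone": "800-701-5909"}
changes = detect_listing_changes(baseline, current)
# changes -> {"phone": ("555-0100", "800-701-5909")}
```

If any field comes back changed, the owner can restore it through the Google My Business dashboard before too many customers are misrouted.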

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

How to avoid an outbound link penalty from Google

Most of us are aware of link penalties that occur if you have low-quality or spam links pointing to your site. But did you know you can also be penalized by Google for how you link to other websites from your site? Yup, you sure can. It’s called an “unnatural outbound links” penalty, and similar to the inbound link penalty, it can be applied partially or sitewide.

Recently, we conducted an audit for a new client and flagged the spammy linking taking place in a particular section of their site. The content manager had unknowingly been allowing guest bloggers to publish content containing a high volume of links back to their own sites, with over-optimized anchor text.

Despite our efforts to convey the severity of the issue, our recommendation to remove these links was not treated as a high priority and went unaddressed.

Then Google released the Penguin 4 real-time update. Soon after, our client’s site was flagged for a manual penalty. Below are screen shots from Google Search Console outlining the manual outbound link penalty for partial site matches.

How to avoid an outbound link penalty

Here are some tips to help you avoid an outbound link penalty:

- Avoid linking to spammy or low-quality websites.
- Nofollow links in user-generated content by default, or simply don't allow them.
- Don't allow any links within guest post content published on your site unless someone on your staff has manually reviewed and approved them.
- Do not link to sites that provide you with some type of compensation for doing so, such as money, goods or services, or reciprocal links.
- Train your site's content managers to be aware of who and what they are linking to. Reference Google's Link Schemes resource page.
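The second tip, nofollowing user-generated links by default, can be enforced when content is published. Here is a minimal sketch in Python; it is regex-based for brevity (production code should use a proper HTML parser), and the function name is my own, not an established API.

```python
import re

def nofollow_ugc(html: str) -> str:
    """Stamp rel="nofollow" onto every <a> tag in a fragment of
    user-generated HTML. Minimal regex-based sketch; a real
    implementation should use an HTML parser instead."""
    def fix(match):
        tag = match.group(0)
        # already nofollowed: leave the tag alone
        if re.search(r'rel\s*=\s*"[^"]*nofollow', tag, re.IGNORECASE):
            return tag
        # existing rel attribute: prepend nofollow to its value
        if re.search(r'rel\s*=\s*"', tag, re.IGNORECASE):
            return re.sub(r'(rel\s*=\s*")', r'\1nofollow ', tag,
                          count=1, flags=re.IGNORECASE)
        # no rel attribute: insert one right after the tag name
        return tag[:2] + ' rel="nofollow"' + tag[2:]
    return re.sub(r'<a\b[^>]*>', fix, html, flags=re.IGNORECASE)

# Example: a guest-post fragment with a promotional link
print(nofollow_ugc('<a href="http://spam.example/offer">deal</a>'))
# <a rel="nofollow" href="http://spam.example/offer">deal</a>
```

Running submitted content through a filter like this before it goes live means no editorial oversight lapse can pass PageRank to a spammy destination.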

What to do if you’re penalized

If you’ve received an outbound linking penalty, you should take the following steps to facilitate a resolution:

1. Identify the links on your site that point to external websites. A crawler such as Screaming Frog (see its External report) can generate this list for you.
2. Audit these links to identify the ones that do not meet Google's guidelines.
3. Remove the problematic links, or add rel="nofollow" to them so they do not pass PageRank. A nofollowed link looks like this:

<a href="signin.php" rel="nofollow">sign in</a>
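If you don't have a crawler on hand, the inventory step only needs an HTML parser. Below is a minimal sketch using just Python's standard library; the domain and markup in the example are illustrative, not from the article.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCollector(HTMLParser):
    """Collect every <a href> whose host differs from your own domain."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # relative links and same-host links are internal; skip them
        if host and host != self.own_domain:
            self.outbound.append(href)

def outbound_links(html, own_domain):
    collector = OutboundLinkCollector(own_domain)
    collector.feed(html)
    return collector.outbound

# Example audit input (illustrative markup and domains)
page = '<a href="/about">About</a> <a href="http://low-quality.example/offer">deal</a>'
print(outbound_links(page, "mysite.example"))
# ['http://low-quality.example/offer']
```

Feeding each page of the site through this collector yields the outbound-link inventory that the audit in step 2 works from.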

Submit a reconsideration request in Google Search Console. Explain what changes were made on your site to remove the link issues. Be as detailed as possible, and describe the steps you have put in place to prevent this from happening again.
