Optimizing for Hanukkah: Sometimes it’s still strings, not things

My wife came to me with a problem. She wanted festive, whimsical, and potentially matching Hanukkah pajamas. But there weren’t enough options coming up in Google under one spelling of the holiday’s name, so she told me she was systematically going through all spellings to compile her list of shopping items.

I was pretty surprised by this — I had expected Google to be smart enough to recognize that these were alternative spellings of the same thing, especially post-Hummingbird. Clearly, this was not the case.

Some background for those who don’t know: Hanukkah is actually a transliterated word from Hebrew. Since Hebrew has its own alphabet, there are numerous spellings that one can use to reference it: Hanukkah, Chanukah, and Channukah are all acceptable spellings of the same holiday.

So, when someone searches for “Hanukkah pajamas” or “Chanukah pajamas,” Google really should be smart enough to understand that they are different spellings of the same concept and provide nearly identical results. But Google does not! I imagine this happens for other holidays and names from other cultures, and I’d be curious to know if other readers experience the same problem with those.

Why am I surprised that Google is returning different results for different spellings? Well, with the introduction of the Knowledge Graph (and Hummingbird), Google signaled a change for SEO. More than ever before, we could start thinking about search queries not merely as keyword strings, but as interrelated real-world concepts.

What do I mean by this?

When someone searches for “Abraham Lincoln,” they’re more than likely searching for the entity representing the 16th president of the United States, rather than the appearance of the words “Abraham” and “Lincoln,” or their uncle, also named Abraham Lincoln. And if they search for “Lincoln party,” Google knows we’re likely discussing political parties, rather than parties in the town of Lincoln, Mass., because this is a concept in close association with the historical entity Abraham Lincoln.

Similarly, Google is certainly capable of understanding that when we use the keyword Hanukkah, it is in reference to the holiday entity and that the various spellings are also referring to the same entity. Despite different spellings, the different searches actually mean the same thing. But alas, as demonstrated by my wife’s need to run a different search for each spelling of the holiday in order to discover all of her Hanukkah pajama options, Google wasn’t doing the best job.

So, how widespread is the Chanukah/Hanukkah/Chanukkah search problem? Here are a couple of search results for Chanukah items:

As you can see from the first screenshot, some big-box retailers like Target, Macy’s and JCPenney rank on page one of Google. In the second screenshot, however, they are largely absent — and sites like PajamaGram and Etsy dominate the other spelling’s SERP.

This means that stores targeting the already small demographic of Hanukkah shoppers are actually reducing the number of potential customers by only using one spelling on their page. (Indeed, according to my keyword tool of choice, although “Hanukkah” has the highest search volume of all variants at 301,100 global monthly searches, all other spellings combined still make up a sizeable 55,500 searches — meaning that retailers optimizing for both terms could be seeing 18 percent more traffic.)

Investigating spelling variations and observations

Since I’m an ever-curious person, I wanted to investigate this phenomenon a little further.

I built a small, simple tool to show how similar the search engine results pages (SERPs) for two different queries are by examining which listings appear in both SERPs. If we look at five common spellings of Hanukkah, we see the following:

Keyword 1    Keyword 2    SERP Similarity
Channukah    Chanukah     90.00%
Channukah    Hannukah     20.00%
Channukah    Hannukkah    20.00%
Channukah    Hanukkah     30.00%
Chanukah     Hannukah     20.00%
Chanukah     Hannukkah    20.00%
Chanukah     Hanukkah     30.00%
Hannukah     Hannukkah    90.00%
Hannukah     Hanukkah     80.00%
Hannukkah    Hanukkah     80.00%

The tool shows something quite interesting here: Not only are the results different, but depending on spelling, the results may only be 20 percent identical, meaning eight out of 10 of the listings on page one are completely different.
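The overlap measurement behind the tool can be sketched in a few lines of Python. This is only an illustration of the calculation, not the actual tool; the two listing sets below are placeholders (seeded with a few of the retailers mentioned above), not real SERP data.

```python
# Sketch of a SERP-similarity check: what percentage of page-one
# listings appear in both result sets?

def serp_similarity(serp_a, serp_b):
    """Percentage of listings that appear in both top-10 SERPs."""
    shared = set(serp_a) & set(serp_b)
    return 100.0 * len(shared) / max(len(serp_a), len(serp_b))

# Placeholder top-10 listings for two spelling variants
hanukkah_serp = ["target.com", "macys.com", "jcpenney.com", "amazon.com",
                 "etsy.com", "example-1.com", "example-2.com", "example-3.com",
                 "example-4.com", "example-5.com"]
chanukah_serp = ["pajamagram.com", "etsy.com", "example-6.com", "example-7.com",
                 "example-8.com", "example-9.com", "target.com", "example-10.com",
                 "example-11.com", "example-12.com"]

print(f"{serp_similarity(hanukkah_serp, chanukah_serp):.2f}%")  # → 20.00%
```

With only two listings shared between these placeholder SERPs, the similarity is 20 percent — the same floor observed in the table above.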

I then became curious about why the terms weren’t canonicalized to each other, so I looked at Wikidata, one of the primary data sources that Google uses for its Knowledge Graph. As it turns out, there is an entity with all of the variants accounted for:

I then checked the Google Knowledge Graph Search API, and it became very clear that Google may be confused:

Keyword      resultScore    @id             name                  Description                       @type
Channukah    8.081924       kg:/m/0vpq52    Channukah Love        Song by Ju-Tang                   [MusicRecording, Thing]
Chanukah     16.334606      kg:/m/06xmqp_   A Rugrats Chanukah    ?                                 [Thing]
Hannukah     11.404715      kg:/m/0zvjvwt   Hannukah              Song by Lorna                     [MusicRecording, Thing]
Hannukkah    11.599854      kg:/m/06vrjy9   Hannukkah             Book by Jennifer Blizin Gillis    [Book, Thing]
Hanukkah     21.56493       kg:/m/02873z    Hanukkah Harry        Fictional character               [Thing]

The resultScore values — which, according to the API documentation, indicate “how well the entity matched the request constraints” — are very low. In this case, the entity wasn’t very well matched. This would be consistent with the varying results if it weren’t for the fact that a Knowledge Graph is being returned for all of the spelling variants with the Freebase ID /m/022w4 — different from what is returned from the Knowledge Graph API. So, in this case, it seems that the API may not be a reliable means of assessing the problem. Let’s move on to some other observations.
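If you want to reproduce this check yourself, the Knowledge Graph Search API can be queried over plain HTTPS. Here's a minimal sketch using only the standard library; the API key is a placeholder you'd replace with your own from the Google Cloud console, and the `top_entity` helper is my own naming, not part of the API.

```python
# Sketch of querying the Knowledge Graph Search API for each spelling variant.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"
API_KEY = "YOUR_API_KEY"  # placeholder — substitute a real key before running

def kg_request_url(query, limit=1):
    """Build the request URL for one spelling variant."""
    params = {"query": query, "key": API_KEY, "limit": limit}
    return f"{ENDPOINT}?{urlencode(params)}"

def top_entity(query):
    """Return (name, resultScore) of the best-matched entity (requires network)."""
    with urlopen(kg_request_url(query)) as resp:
        data = json.load(resp)
    element = data["itemListElement"][0]
    return element["result"]["name"], element["resultScore"]

for spelling in ["Channukah", "Chanukah", "Hannukah", "Hannukkah", "Hanukkah"]:
    print(spelling, "->", kg_request_url(spelling))
```

Running `top_entity` for each spelling with a valid key is what produced the divergent entities in the table above.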

It is interesting to note that when searching for Channukah, Google pushed users to Chanukah results. When searching Hannukah and Hannukkah, Google pushed users to Hanukkah results. So, Google does seem to group Hanukkah spellings together based on whether they start with an “H” or a “Ch.”

Chanukah, Hannukah, and Hanukkah were also the only variations that received the special treatment of the Hanukkah menorah graphic:

What a retailer selling Hanukkah products should do

Clearly, if we want full coverage of terms (and my wife to find your Hanukkah pajamas), we cannot rely on just optimizing for the highest search volume variation of the keyword, as Google doesn’t seem to view all variants as entirely the same. Your best bet is to include the actual string for each spelling variant somewhere on the page, rather than relying on Google to understand them as variations of the same thing.

If you’re a smaller player, it may make sense to prioritize optimizations toward one of the less popular spelling variants, as the organic competition may not be as significant. (Of course, this does not bar you from using spelling variants in addition to that for the potential of winning for multiple spellings.)

At a bare minimum, you may opt to include a spelling beginning with H- and Ch- and hope that Google will direct users to the same SERP in most cases.

Future experiment

I started an experiment to see whether the inclusion of structured data with sameAs properties may be a potential avenue for getting Google to understand a single spelling as an entity, eliminating the need to include different spelling variations. As of now, it’s a little too early to know the results of the test, and they are inconclusive, but I look forward to sharing those results in the future.
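For readers curious about what such a test involves, the markup in question might look something like the following JSON-LD fragment. The product page URL and name are illustrative, not taken from the actual experiment; the idea is simply to declare, via sameAs, that the page's topic is the holiday entity.

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Hanukkah Pajamas",
  "url": "https://www.example.com/hanukkah-pajamas",
  "about": {
    "@type": "Thing",
    "name": "Hanukkah",
    "sameAs": "https://en.wikipedia.org/wiki/Hanukkah"
  }
}
```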

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

SEO in 2018: Optimizing for voice search

Google Webmaster Trends Analyst John Mueller recently asked for feedback on why webmasters are looking for Google to separate voice search queries in Search Console. If you, like me, want to see voice searches in Google Search Console, definitely submit your feedback on Twitter as John requested.

I hear folks asking about voice search data in Search Console often. Can you elaborate on what you want to see there? What's an example of such a query that would be useful? pic.twitter.com/WOqS7aH4tP

— John ☆.o(≧▽≦)o.☆ (@JohnMu) December 7, 2017

I lived through the very beginnings of mobile SEO, where many people thought mobile search behavior would be completely different from desktop search behavior only to find that much of it is the same. So I see why Mueller and others don’t necessarily understand why Search Console users would want to see voice queries separately. Some queries are the same whether they’re typed into a computer at a desktop or spoken across the room to a Google Home.

That being said, there are some very good reasons to want voice search data. Optimizing for voice search requires some slightly different tactics from those for traditional SEO, and having insight into these queries could help you provide a better experience for those searching by voice.

Not convinced you should care about voice search? Here are three reasons I think you should:

1. More visibility on featured snippets

One of the interesting things about Google Home is that when it answers a question with information from the web, it will cite the source of the information by saying the website’s name, and it will often send a link to the searcher’s Google Home app.

Currently, Google Home and Google Assistant read snippets from sites that are ranked in “position zero” and have been granted a featured snippet. This is why more people than ever are talking about how to optimize for featured snippets. If you look at the articles published on the topic (according to what Google has indexed), you’ll see that the number of articles about how to optimize for featured snippets has grown 178 percent in the past year:

Understanding voice search queries could help us better understand the types of queries that surface featured snippets. As marketers, we could then devote time and resources to providing the best answer for the most common featured snippets in hopes of getting promoted to position zero.

This helps marketers drive credibility to their brand when Google reads their best answer to the searcher, potentially driving traffic to the site from the Google Home app.

And this helps Google because they benefit when featured snippets provide good answers and the searcher is satisfied with the Google Home results. The better the service, the more consumers will use it — and potentially buy more Google Home units or Android phones because they think the service is worthwhile.

If bad featured snippets are found because no one is trying to optimize for those queries, or no featured snippets are found and the Google Home unit must apologize for not being able to help with that query yet, Google potentially loses market share to Amazon in the smart speaker race and Apple in the personal assistant race.

So this one is a win-win, Google. You need more great responses competing for position zero, and we want to help. But first, we need to know what types of queries commonly trigger featured snippets from voice search, and that’s why we need this data in Search Console today.

2. Better way to meet consumer demand and query intent based on context

We saw two major things happen in the early days of mobile SEO when we compared desktop and mobile queries:

    1. Searchers often used the same keywords in mobile search that they did in desktop search; however, certain keywords were used much more often on mobile search than desktop search (and vice versa).
    2. Whole new categories of queries emerged as searchers realized that GPS and other features of mobile search could allow them to use queries that just didn’t work in desktop search.

An example of the first point is a query like “store hours,” which peaks in volume when shoppers are headed to stores:

An example of the second is “near me” queries, which have grown dramatically with mobile search and mostly occur on mobile phones:

The mode of search therefore changes search behavior as searchers understand what types of searches work well on mobile but not on desktop.

Consider this in the context of voice search. There are certain types of queries that only work on Google Home and Google Assistant. “Tell me about my day” is one. We can guess some of the others, but if we had voice search data labeled, we wouldn’t have to.

How would this be useful to marketers and site owners? Well, it’s hard to say exactly without looking at the data, but consider the context in which someone might use voice search: driving to the mall to get a present for the holidays or asking Google Home if a store down the street is still open. Does the searcher still say, “Holiday Hut store hours?” Or do they say something like, “OK Google, give me the store hours for the Holiday Hut at the local mall?” Or even, “How late is Holiday Hut open?”

Google should consider all these queries synonymous in this case, but in some cases, there could be significant differences between voice search behavior and typed search behavior that will affect how a site owner optimizes a page.

Google has told us that voice searches are different, in that they’re 30 times more likely to be action queries than typed searches. In many cases, these won’t be actionable to marketers — but in some cases, they will be. And in order to properly alter our content to connect with searchers, we’ll first need to understand the differences.

In my initial look at how my own family searched on Google Home, I found significant differences between what my family asked Home and what I ask my smartphone, so there’s reason to believe that there are new query categories in voice search that would be relevant to marketers. We know that there are queries — like “Hey Google, talk to Dustin from Stranger Things” and “Buy Lacroix Sparkling Water from Target” — that are going to give completely different results in voice search on Google Home and Assistant from the results in traditional search. And these queries, like “store hours” queries, are likely to be searched much more on voice search than in traditional search.

The problem is, how do we find that “near me” of voice search if we don’t have the data?

3. Understanding extent of advertising and optimization potential for new voice-based media

The last reason to pay attention to voice search queries is probably the most important — for both marketers and Google.

Let me illustrate it in very direct terms, as it’s not just an issue that I believe marketers have in general, but one that affects me personally as well.

Recently, one of my company’s competitors released survey information that suggested people really want to buy tickets through smart speakers.

As a marketer and SEO who sells tickets, I can take this information and invest in Actions on Google Development and marketing so that our customers can say, “OK Google, talk to Vivid Seats about buying Super Bowl tickets,” and get something from Google Home other than, “I’m sorry but I don’t know how to help with that yet.” (Disclosure: Vivid Seats is my employer.)

Or maybe I could convince my company to invest resources in custom content, as Disney and Netflix have done with Google. But am I really going to do it based on this one data point? Probably not.

As with mobile search in 2005, we don’t know how many people are using voice search in Google Home and Google Assistant yet, so we can’t yet know how big the opportunity is or how fast it’s growing. Voice search is in the “innovators and early adopters” stage of the technology adoption life cycle, and any optimizations done for it are not likely to reach a mainstream audience just yet. Since we don’t have data to the contrary from Google or Amazon, we’ll have to stay with this assumption and invest at a later date, when the impact of this technology on the market will likely mean a significant return on our investment.

If we had that data from Google, I would be able to use it to make a stronger case for early adoption and investment than just using survey data alone. For example, I would be able to say to the executives, “Look how many people are searching for branded queries in voice search and getting zero results! By investing resources in creating a prototype for Google Home and Assistant search, we can satisfy navigational queries that are currently going nowhere and recoup our investment.” Instead, because we don’t have that data from Google, the business case isn’t nearly as strong.

Google has yet to monetize voice search in any meaningful way, but when advertising appears on Google Home, this type of analysis will become even more essential.

Final thoughts

Yes, we can do optimization without knowing which queries are voice search queries, as we could do mobile optimization without knowing which queries are mobile queries; yet understanding the nuances of voice search will help Google and marketers do a better job of helping searchers find exactly what they’re looking for when they’re asking for it by voice.

If you agree, please submit your feedback to John Mueller on Twitter.


Visualizing your site structure in advance of a major change

In our last article, we looked at some interesting ways to visualize your website structure to illuminate how external links and PageRank flow through it. This time, we’re going to use the same tools, but we’re going to look instead at how a major site structure change might impact your site.

Search engine crawlers can determine which pages on your site are the most important, based, in part, on how your internal links are structured and organized. Pages that have a lot of internal links pointing to them — including links from the site’s navigation — are generally considered to be your most important pages. Though these are not always your highest-ranking pages, high internal PageRank often correlates with better search engine visibility.

Note: I use the phrase “internal PageRank,” coined by Paul Shapiro, to refer to the relative importance of each page within a single website based on that site’s internal linking structure. This term may be used interchangeably with “page weight.”

The technique I’ll outline below can be used to consider how internal PageRank will be impacted by the addition of new sections, major changes to global site navigation (as we’ll see below) and most major changes to site structure or internal linking.

Understanding how any major change to a site could potentially impact its search visibility is paramount to determining the risk vs. reward of its implementation. This is one of the techniques I’ve found most helpful in such situations, as it provides numbers we can reference to understand if (and how) page weight will be impacted by a structural adjustment.

In the example below, we’re going to assume you have access to a staging server, and that on that server you will host a copy of your site with the considered adjustments. In the absence of such a server, you can edit the spreadsheets manually to reflect the changes being considered. (However, to save time, it’s probably worth setting up a secondary hosting account for the tests and development.)

It’s worth noting that on the staging server, one need only mimic the structure and not the final design or content. Example: For a site that I’m working on, I considered removing a block of links in a drop-down from the global site navigation and replacing that block of links with a single text link. That link would go to a page containing the links that were previously in the drop-down menu.

When I implemented this site structure change on the staging server, I didn’t worry about whether any of this looked good — I simply created a new page with a big list of text links, removed all the links from the navigation drop-down, and replaced the drop-down with a single link to the new page.

I would never put this live, obviously — but my changes on the staging server mimic the site structure change being considered, giving me insight into what will happen to the internal PageRank distribution (as we’ll see below). I’ll leave it to the designers to make it look good.

For this process, we’re going to need three tools:

    Screaming Frog — The free version will do if your site is under 500 pages or you just want a rough idea of what the changes will mean.
    Gephi — A free, powerful data visualization tool.
    Google Analytics

So, let’s dive in…

Collecting your data

I don’t want to be redundant, so I’ll spare you re-reading about how to crawl and export your site data using Screaming Frog. If you missed the last piece, which explains this process in detail, you can find it here.

Once the crawl is complete and you have your site data, you need simply export the relevant data as follows:

Bulk Export > Response Codes > Success (2xx) Inlinks

You will do this for both your live site and your staging site (the one with the adjusted structure). Once you have downloaded both structures, you’ll need to format them for Gephi. All that Gephi needs to create a visualization is an understanding of your site pages (“nodes”) and the links between them (“edges”).

Note: Before we ready the data, I recommend doing a Find & Replace in the staging CSV file and replacing your staging server domain/IP with that of your actual site. This will make it easier to use and understand in future steps.

As Gephi doesn’t need a lot of the data from the Screaming Frog export, we’ll want to strip out what’s not necessary from these CSV files by doing the following:

    1. Delete the first row containing “Success (2xx) Inlinks.”
    2. Rename the “Destination” column “Target.”
    3. Delete all other columns besides “Source” and “Target.” (Note: Before deleting it, you may want to do a quick Sort by the Type column and remove anything that isn’t labeled as “AHREF” — CSS, JS, IMG and so on — to avoid contaminating your visualization.)
    4. Save the edited file. You can name it whatever you’d like. I tend to use domain-live.csv and domain-staging.csv.
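If you'd rather script these cleanup steps than do them by hand, here's a rough pandas sketch. It assumes the Screaming Frog export keeps its usual “Source,” “Destination” and “Type” column names; the function name is my own.

```python
# Sketch of the CSV cleanup steps above, done in pandas instead of a spreadsheet.
import pandas as pd

def prepare_edges(df):
    """Reduce a Screaming Frog inlinks export to the Source/Target pairs Gephi needs."""
    # Keep only true hyperlinks, dropping CSS, JS, IMG and similar rows
    if "Type" in df.columns:
        df = df[df["Type"] == "AHREF"]
    # Rename "Destination" to "Target" and keep only the two columns Gephi uses
    return df.rename(columns={"Destination": "Target"})[["Source", "Target"]]

# Example usage (skiprows=1 drops the "Success (2xx) Inlinks" title row):
# edges = prepare_edges(pd.read_csv("screaming-frog-export.csv", skiprows=1))
# edges.to_csv("domain-live.csv", index=False)
```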

The third set of data we’ll want to have is an Export of our organic landing pages from Google Analytics. You can use different metrics, but I’ve found it extremely helpful to have a visual of which pages are most responsible for my organic traffic when considering the impact of a structural change on page weight. Essentially, if you find that a page responsible for a good deal of your traffic will suffer a reduction in internal PageRank, you will want to know this and adjust accordingly.

To get this information into the graph, simply log into Google Analytics, and in the left-hand navigation under “Behavior,” go to “Site Content” and select “Landing Pages.” In your segments at the top of the page, remove “All Users” and replace it with “Organic Traffic.” This will restrict your landing page data to only your organic visitors.

Expand the data to include as many rows as you’d like (up to 5,000) and then Export your data to a CSV, which will give you something like:

Remove the first six rows so your heading row begins with the “Landing Page” label. Then, scroll to the bottom and remove the accumulated totals (the last row below the pages), as well as the “Day Index” and “Sessions” data.

Note that you’ll need the Landing Page URLs in this spreadsheet to be in the same format as the Source URLs in your Screaming Frog CSV files. In the example shown above, the URLs in the Landing Page column are missing the protocol (https) and subdomain (www), so I would need to use a Find & Replace to add this information.

Now we’re ready to go.

Getting a visualization of your current site

The first step is getting your current site page map uploaded — that is, letting Gephi know which pages you have and what they link to.

To begin, open Gephi and go to File > Import Spreadsheet.  You’ll select the live site Screaming Frog export (in my case, yoursite-live.csv) and make sure the “As table:” drop-down is set to “Edges table.”

On the next screen, make sure you’ve checked “Create missing nodes,” which will tell Gephi to create nodes (read: pages) for the “Edges table” (read: link map) that you’ve entered. And now you’ve got your graph. Isn’t it helpful?

OK, not really — but it will be. The next step is to get that Google Analytics data in there. So let’s head over to the Data Laboratory (among the top buttons) and do that.

First, we need to export our page data. When you’re in the Data Laboratory, make sure you’re looking at the Nodes data and Export it.

When you open the CSV, it should have the following columns:

    Id (which contains your page URLs)
    Label
    Timeset

You’ll add a fourth column with the data you want to pull in from Google Analytics, which in our case will be “Sessions.” You’ll need to temporarily add a second sheet to the CSV and name it “analytics,” where you’ll copy the data from your analytics export earlier (essentially just moving it into this Workbook).

Now, what we want to do is fill the Sessions column with the actual session data from analytics. To do this, we need a formula that will look through the node Ids in sheet one and look for the corresponding landing page URL in sheet two; when it finds it, it should insert the organic traffic sessions for that page into the Sessions column where appropriate.

Probably my most-used Excel script does the trick here. In the top cell of the “Sessions” column you created, enter the following (the row number 236 will change based on the number of rows of data in your analytics export):

=IFERROR(INDEX(analytics!$B$2:$B$236,MATCH(A2,analytics!$A$2:$A$236,0),1),"0")

Once completed, you’ll want to copy the Sessions column and use the “Paste Values” command, which will switch the cells from containing a formula to containing a value.
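If you'd rather do this lookup outside Excel, the same INDEX/MATCH logic can be sketched in pandas. The column names assume the exports described above; the sample data and function name are illustrative.

```python
# Pandas equivalent of the INDEX/MATCH formula above: look up each node Id
# in the analytics landing-page export and pull in its Sessions, defaulting
# to 0 when a page had no organic traffic.
import pandas as pd

def add_sessions(nodes, analytics):
    merged = nodes.merge(
        analytics.rename(columns={"Landing Page": "Id"}),
        on="Id", how="left",
    )
    merged["Sessions"] = merged["Sessions"].fillna(0).astype(int)
    return merged

# Illustrative data: one page with organic traffic, one without
nodes = pd.DataFrame({"Id": ["https://www.example.com/", "https://www.example.com/about"]})
analytics = pd.DataFrame({"Landing Page": ["https://www.example.com/"], "Sessions": [120]})
print(add_sessions(nodes, analytics))
```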

All that’s left now is to re-import the new sheet back into Gephi. Save the spreadsheet as something like data-laboratory-export.csv (or just nodes.csv if you prefer). Using the Import feature from in the Data Laboratory, you can re-import the file, which now includes the session data.

Now, let’s switch from the Data Laboratory tab back to the Overview tab. At this point, the graph looks virtually identical to how it did before — but that’s about to change. First, let’s apply some internal PageRank. Fortunately, a PageRank feature is built right into Gephi, based on the calculations in the initial Google patents. It’s not perfect, but it’s pretty good for giving you an idea of what your internal page weight flow is doing.

To accomplish this, simply click the “Run” button beside “PageRank” in the right-hand panel. You can leave all the defaults as they are.
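If you want to sanity-check Gephi's numbers (or skip the GUI entirely), the same classic PageRank calculation is available in networkx. This is a sketch under the assumption that your edges CSV has the Source/Target columns prepared earlier; the function name is mine.

```python
# Sketch of recomputing internal PageRank from the prepared edges data
# with networkx (default damping factor 0.85, as in the original patents).
import networkx as nx
import pandas as pd

def internal_pagerank(edges):
    """edges: DataFrame with Source and Target columns, as prepared for Gephi."""
    graph = nx.from_pandas_edgelist(
        edges, source="Source", target="Target", create_using=nx.DiGraph
    )
    return nx.pagerank(graph, alpha=0.85)

# Example usage:
# edges = pd.read_csv("domain-live.csv")
# for url, score in sorted(internal_pagerank(edges).items(), key=lambda kv: -kv[1])[:10]:
#     print(f"{score:.6f}  {url}")
```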

The next thing you’ll want to do is color-code the nodes (which represent your site pages) based on the number of sessions and size them based on their PageRank. To do this, simply select the color palette for the nodes under the “Appearance” pane to the upper left. Select sessions from the drop-down and choose a palette you like. Once you’ve chosen your settings, click “Apply.”

Next, we’ll do the same for PageRank, except we’ll be adjusting size rather than color. Select the sizing tool, choose PageRank from the drop-down, and select the maximum and minimum sizes (this will be a relative sizing based on page weight). I generally start with 10 and 30, respectively, but you might want to play around with them. Once you’ve chosen your desired settings, click “Apply.”

The final step of the visualization is to select a layout in the bottom left panel. I like “Force Atlas” for this purpose, but feel free to try them all out. This gives us a picture that looks something like the following:

You can easily reference which pages have no organic traffic and which have the most based on their color — and by right-clicking them, you can view them directly in the Data Laboratory to get their internal PageRank. (In this instance, we can learn one of the highest traffic pages is a product page with a PageRank of 0.016629.) We can also see how our most-trafficked pages tend to be clustered towards the center, meaning they’re heavily linked within the site.

Now, let’s see what happens with the new structure. You’ll want to go through the same steps above, but with the Screaming Frog export from the staging server (in my case, domain-staging.csv). I’m not going to make you read through all the same steps, but here’s what the final result looks like:

We can see that there are a lot more outliers in this version (pages that have generally been significantly reduced in their internal links). We can investigate which pages those are by right-clicking them and viewing them in the Data Laboratory, which will help us locate possible unexpected problems.

We also have the opportunity to see what happened to that high-traffic product page mentioned above. In this case, under the new structure, its internal PageRank shifted to 0.02171 — in other words, it got stronger.

There are two things that may have caused this internal PageRank increase: an increase in the number of links to the page, or a drop in the number of links to other pages.

At its core, a page can be considered as having 100 percent of its PageRank. Notwithstanding considerations like Google reduction in PageRank with each link or weighting by position on the page, PageRank flows to other pages via links, and that “link juice” is split among the links. So, if there are 10 links on a page, each will get 10 percent. If you drop the total number of links to five, then each will get 20 percent.
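The arithmetic of this simplified model is easy to verify with a toy example:

```python
# Toy illustration of the simplified "link juice" split described above:
# a page's PageRank share is divided evenly among its outbound links.
def juice_per_link(page_rank_share, outbound_links):
    return page_rank_share / outbound_links

print(juice_per_link(100, 10))  # 10 links -> each gets 10 percent
print(juice_per_link(100, 5))   # 5 links  -> each gets 20 percent
```

Halving the outbound links doubles the share each remaining link passes, which is exactly the kind of shift the staging-server comparison is designed to surface.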

Again, this is a fairly simplified explanation, but these increases (or decreases) are what we want to measure to understand how a proposed site structure change will impact the internal PageRank of our most valuable organic pages.

Over in the Data Laboratory, we can also order pages by their PageRank and compare results (or just see how our current structure is working out).

And…

This is just the tip of the iceberg. We can substitute organic sessions for rankings in the page-based data we import (or go crazy and include both). With this data, we can judge what might happen to the PageRank of ranking (or up-and-coming) pages in a site structure shift. Or what about factoring in incoming link weight, as we did in the last article, to see how its passing is impacted?

While no tool or technique can give you 100 percent assurance that a structural change will always go as planned, this technique assists in catching many unexpected issues. (Remember: Look to those outliers!)

This exercise can also help surface unexpected opportunities by isolating pages that will gain page weight as a result of a proposed site structure change. You may wish to (re)optimize these pages before your site structure change goes live so you can improve their chances of getting a rankings boost.


How on-site search can drive holiday revenue & help e-commerce sites compete against major retailers

This holiday season is set to break a new record, with online sales reaching beyond $100 billion, according to Adobe’s recent predictions. If Black Friday and Cyber Monday outcomes are any indication, most of that revenue will be divided among Amazon and a handful of large-scale e-commerce sites, including Walmart, Target and Best Buy.

With so many dollars at stake, there is still a sizeable amount of market share available for smaller online retailers. But what can e-commerce sites do to compete with the likes of Amazon or Walmart?

An optimized on-site search platform could very well be the answer to capturing more conversions and driving more sales during the holidays. Unfortunately, many e-commerce sites may be missing the boat by not paying enough attention to their on-site search efforts.

How on-site search impacts revenue

According to SLI Systems, which offers an AI-powered e-commerce solution, visitors who use on-site search make purchases at a 2.7x greater rate than website visitors who only browse products. If searchers have indicated exactly what they want — specifying a color, size or material within their query — SLI Systems says it’s the e-commerce site’s job to quickly deliver the product that best matches their search.

“Don’t make these folks navigate their way to what they want. No extra clicks. You’ll likely lose them even if you have a great price and an amazing free shipping offer,” says Bob Angus, an e-commerce consultant, in a post on SLI Systems’ company blog.

Eli Finkelshteyn, founder and CEO of on-site search platform Constructor.io, says most of the on-site search market is still predominantly made up of companies that have built platforms in-house.

“I think there’s an erroneous belief among a lot of companies that search is really core to what they do,” says Finkelshteyn.

“At the end of the day, I think, for e-commerce websites, they’ve got things they need to build themselves, that no one can help them with — things like merchandising, making sure you have the lowest prices, quick delivery, that you have the product that customers want — but search is adjacent to that.”

Finkelshteyn says companies need to make sure their on-site search is optimized so that consumers find the products they want.

“I think that’s notoriously difficult to do,” says Finkelshteyn.

With an on-site search function, you may only be serving up a limited number of results. If a consumer is searching your site for a specific product, Finkelshteyn says it is imperative your on-site search knows how to deliver the most relevant products.

The technology driving an optimized on-site search experience

Constructor.io’s platform incorporates a number of technologies, including the integration of machine learning to improve personalized auto-suggestion results.

“Typo-tolerance is automatic with us. We do that using phonetic and typographic dissonances,” says Finkelshteyn. “What that means, essentially, is that we’re mapping how a word is pronounced to the canonical word in your data set.”

For example, if someone is searching for a Kohler faucet but enters a search for Koler — they will receive the correct product match.

Finkelshteyn says another fairly common on-site search challenge is typographical misspellings — when someone simply enters a typo. An effective on-site search platform should be able to recognize common misspellings and still surface relevant products.
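As a rough illustration of typo-tolerance, here is a hypothetical sketch that maps a misspelled query term to the closest term in a product catalog using edit-distance similarity from Python's standard difflib. The catalog terms and cutoff are invented; a production system like the one described above would also layer in phonetic matching and query logs.

```python
import difflib

# Invented mini-catalog of brand and product terms for illustration.
CATALOG_TERMS = ["kohler", "moen", "delta", "grohe", "faucet", "sink"]

def correct_term(query_term, cutoff=0.6):
    """Return the closest catalog term, or the input if nothing is close."""
    matches = difflib.get_close_matches(query_term.lower(), CATALOG_TERMS,
                                        n=1, cutoff=cutoff)
    return matches[0] if matches else query_term

print(correct_term("koler"))  # maps the typo to "kohler"
print(correct_term("zzzq"))   # no close match, so the input passes through
```

The cutoff controls how aggressive the correction is: too low and unrelated terms get "corrected," too high and genuine typos slip through unmatched.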

On-site search from a brand’s point of view

Dennis Goedegebuure serves as the VP of growth and SEO for sporting apparel company Fanatics. The company operates more than 300 online and offline partner stores. A portion of those stores handle the e-commerce business for all major professional sports leagues.

“I work very closely with the on-site search teams to make sure the sites differentiate themselves with the offers we give our users,” says Goedegebuure.

The VP of growth says on-site search plays a crucial role in Fanatics’ e-commerce business.

“When you capture a visit, you would like to offer your customer the best selection. So making sure they get the best selection at the best price for the best value to make the sale is obviously top priority,” says Goedegebuure.

According to Goedegebuure, it’s not only about product competition, but the competition among online retailers for share of wallet.

“The customer only has a certain amount of money to spend, you would like to make sure they spend it with you.”

Goedegebuure’s teams are constantly running tests to fine-tune their sites’ on-site search functions.

“We’re running a bunch of experiments all the time, from sizing of the pictures to the little icons that we add to the search, to sort-order, to the number of items in the search result page,” says Goedegebuure. “We’re running constant experiments to find an optimal configuration of our search and to improve the conversions we get out of the traffic.”

According to Goedegebuure, the on-site search tests his teams are running have helped identify a definite sweet spot for the number of items displayed in search results, as well as how the sizing of a picture can impact conversion rates.

On-site search for the holidays

In terms of holiday preparation, Goedegebuure says Fanatics’ on-site search algorithms may be tweaked to align with holiday promotions.

“If we have a brand on sale — like our own Fanatics brand — these might be pushed up to the top because there are better pricing points,” says Goedegebuure. “If an item goes off sale, you need to adjust for that.”

Finkelshteyn says one of the major on-site search mistakes he sees companies making this time of year is failing to refresh their index rankings.

“If you have a search index with rankings you’ve built over the last year, you still might be optimizing for searches that are not really seasonal right now,” says Finkelshteyn. “For example, if somebody searches for the word ‘blanket’ during the summer, you probably want to give them a beach blanket. If somebody searches for the word ‘blanket’ during the winter, you probably want to give them a warm blanket.”
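The seasonal re-ranking idea can be sketched with a toy example. Everything here (the products, base scores, seasons and boost weight) is invented for illustration; the point is only that the same ambiguous query can surface different products depending on the time of year.

```python
# Invented product data for the "blanket" example above.
PRODUCTS = [
    {"name": "beach blanket", "base_score": 0.8, "season": "summer"},
    {"name": "wool blanket",  "base_score": 0.7, "season": "winter"},
]

def season_for(month):
    """Very rough northern-hemisphere seasons, for illustration only."""
    if month in (12, 1, 2):
        return "winter"
    if month in (6, 7, 8):
        return "summer"
    return None

def rank(products, month, boost=0.5):
    """Sort products by base relevance plus a flat in-season boost."""
    season = season_for(month)
    return sorted(
        products,
        key=lambda p: p["base_score"] + (boost if p["season"] == season else 0),
        reverse=True,
    )

print([p["name"] for p in rank(PRODUCTS, month=12)])  # wool blanket first
print([p["name"] for p in rank(PRODUCTS, month=7)])   # beach blanket first
```

A real index would refresh these weights from recent query and conversion data rather than a hard-coded calendar, but the mechanism is the same: relevance plus a seasonal adjustment.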

Whether your company has built its on-site search platform in-house or is using a vendor platform, making sure it is optimized for the holiday e-commerce surge should be a top priority. As we enter the final days of the shopping season, there is still much revenue up for grabs.

Adobe’s latest reports found that holiday e-commerce had reached $50 billion by the end of November, leaving more than $50 billion of its predicted $100 billion in revenue to be claimed by the year’s end.

For many e-commerce companies, fine-tuning their on-site search algorithms may be the most profitable move they could make this holiday season — and beyond.

[This article first appeared on Marketing Land.]

Google revamps its SEO Starter Guide

Google announced that it has retired the old PDF version of the SEO Starter Guide, originally released in 2008, over nine years ago, replacing it with a new web-based version of the guide.

The last time Google updated this guide was several years ago.

The new guide merges the Webmaster Academy and the old SEO Starter Guide PDF into this one resource section. “The updated version builds on top of the previously available document, and has additional sections on the need for search engine optimization, adding structured data markup and building mobile-friendly websites,” Google said.

It is also currently available in nine different languages, including English, German, Spanish, French, Italian, Japanese, Portuguese, Russian and Turkish.

The new SEO guide can now be accessed online.

A balanced approach to data-driven SEO

We have nearly unlimited access to information and data. For search marketers, this can be a blessing or a curse. It’s very easy to get sucked into the never-ending pool of data — but this rarely, if ever, benefits our work. So how do we protect ourselves from information overload?

Futurologist Alvin Toffler predicted in 1970 that the rapidly increasing amounts of information produced would eventually cause people problems. More than a few times, I’ve found myself overwhelmed and overloaded with information, and my guess is that you have also experienced this phenomenon.

If you take your SEO seriously, then you understand the necessity of tracking your efforts — after all, data is at the core of good SEO.

Management thinker Peter Drucker is often credited as saying, “You can’t manage what you can’t measure.” While I agree completely with the statement, it seems as though some SEOs have resorted to just measuring everything, which is simply not practical. If we are going to be effective, we must have focus and be clear on what we want to track and why we want to track it.

Knowledge vs. information

One of the main causes for information overload, in my opinion, is that SEOs and marketers have confused information and knowledge. It’s as if we believe that if we get more information, we will eventually uncover some knowledge. According to a March 2017 report from the CMO Council, “Empowering the Data-Driven Customer Strategy”:

In the past five years, 42 percent of marketers have installed more than 10 individual solutions across marketing, data, analytics or customer engagement technologies, and 9 percent have brought on more than 20 individual tools or solutions.

While this sounds like marketers are tracking more users at a deeper level, this next stat from the same report paints a different picture:

In that same five-year period, 44 percent of marketers have indicated they have spent more than 25 percent of their marketing budgets to replace existing technologies.

While most are collecting increasing amounts of information, the “turnover” of martech solutions suggests that they have yet to find the knowledge they are really seeking.

So, what’s the difference between knowledge and information?

In this case, information is the data itself — the “facts” in raw form. Knowledge, on the other hand, is derived through carefully analyzing this data to really understand what is happening within your accounts.

Collecting data, for the most part, is free. You install some JavaScript tracking code and sit back and wait. Information on its own provides little value. The real value lies in collecting the right data, then analyzing it with an intent to transform it into knowledge.

The SEO implications

The job of a search professional is to increase the visibility of a brand or website in search and to attract qualified visitors that have the potential to engage. The art and science of SEO can be very complicated. But a quote I recently heard from author and life coach Tony Robbins made me pause and rethink a number of my approaches: “Complexity is the enemy of execution.”

When we bury ourselves in data and overcomplicate our processes, we end up compromising our execution.

What’s the solution?

Now, I am not saying to “dumb down” your approach or ignore your data. What I am advocating for is more focus on what actually matters. Every campaign has its unique challenges and circumstances. In order to uncover actionable knowledge that you can use to improve your campaign’s effectiveness, you must have a clear goal.

Your goals will dictate what you track, how you track and why you track. Use your goals as a guidepost to keep you focused. By having this focus, you will be able to do the work that matters.

Now, it will be tempting to veer off-course and peek at all the other data you may be collecting. And if you must, then set up a limited amount of time to do so. But be careful that you don’t get sucked into the information overload trap.

Here are a few more tips to ensure you keep your head above all the data:

Spend your time collecting and analyzing data that is on a need-to-know basis rather than a nice-to-know basis.

Focus on quality of information collected, rather than quantity.

Don’t multitask. Single-tasking keeps your mind focused on what is most important.

The digital age has given us access to insights our predecessors could not comprehend in their wildest dreams. As search professionals, we are continually interacting with significant amounts of data that can quickly overwhelm us. If we are going to be able to drive results and add value, we must learn how to overcome information overload and focus on what really matters.


5 local search tactics your competitors probably aren't using

Local SEO is competitive and fierce. With more and more local businesses vying for the Google local three-pack — and ads and online directories occupying a large percentage of the remaining SERP real estate — your local SEO strategy has to be aggressive.

So, what can you do to outrank your local competitors down the street, especially when you’ve all got the basics down? One approach is to use local SEO tactics that your competitors may not know about or aren’t using. Here are five local SEO tactics you can implement to help get ahead of your competitors.

Google Posts

First, every local business should claim their Google My Business (GMB) listing. It’s a must-do. Non-negotiable. If you don’t claim your Google My Business listing, you essentially don’t exist online! (Okay, that’s an exaggeration — but not claiming your GMB listing will significantly diminish your chances of showing up in local search results.)

Of your competitors who claim their Google My Business listing, most will just walk away and forget about it. However, claiming your listing and letting it sit there gathering dust is like purchasing a new home and not putting any furniture in it. There’s so much more you should do, and this is one way you can outsmart (and outrank) your competitors.

Google has insight into how you and your potential customers are engaging with your Google My Business listing — and generally speaking, the more activity, the better. Does someone use the click-to-call option on their smartphone? Is a potential customer asking you a question using the new Q&A feature? Did you answer the question? Are you updating your business hours for holidays? Are you uploading quality photos of your business or staff?

And are you utilizing Google Posts?

Google Posts are almost like mini-ads with a picture, description, offer, landing page URL and so on. You can create Posts that tell potential customers about a product or service, promote upcoming specials, offer holiday wishes, let customers know about an event you’re having, and more. Having an open house? Create a Post for that event. Offering a free report or white paper? Create a Post about that white paper and add the link to where people can go to download it.

Creating a Post is easy. Simply log in to your Google My Business dashboard, and to the left, you will see the Posts option. Click on it to get started creating your first Post!

Whether you’re creating a post about an upcoming event, sale, special offer, product or service, try to include keywords relevant to your business and city in the copy of the post. (It can’t hurt!) Make your post compelling so that people who see your GMB listing will want to click on the Post to learn more. (Remember, Google is watching those interactions!)

Once you’ve created your post, here’s how it will look on your Google My Business Listing:

To make sure that the Posts are timely, Google removes Posts after seven days (or, if you set up an event, the Post will be removed when the event date has passed). To keep you on your toes, Google will even send you email reminders when it’s time to create a fresh new Post.

Does creating a Google Post help your local rankings? The verdict’s not 100 percent in, but Joy Hawkins and Steady Demand did some research, and they found that Google Posts did appear to have a mild impact on rankings.

Check your Google My Business category

Speaking of Google My Business, selecting the best GMB category for your business can make a huge difference in how your business ranks on Google. If you find your competitors are leapfrogging ahead of you on the local three-pack, scope out what category their business is listed under — you may want to experiment with selecting that same category.

If matching your competitors’ categories doesn’t move the needle for you, try getting more granular. (Yes, this is a case of trial and error. You may need to test until you find the right category that will get you better visibility and/or more qualified leads.) See the example below, where one of my clients jumped up on Google rankings when we changed her category from the more general “Lawyer” category to a more specific category, “Family Law Attorney.”

It’s always best to choose the category that most accurately fits your business type. Sometimes, people select too many categories, which can “dilute” your specialty. Selecting the best category for your business is a strategy that may mean you fall before you rise — but once you find the “sweet spot,” you can outrank your competitors.

Apply URL best practices

URLs are an important part of your search engine optimization and user experience strategy. Not only do URLs tell your site’s visitors and search engines what a page is about, they also serve as guides for the structure of your website. Your URLs should be descriptive, user-friendly and concise. When appropriate, include keywords (like your city, the name of a product, the type of service and so on) in the URL.

If your website runs on a CMS, you may have to adjust the settings to ensure that your page URLs are SEO-friendly. For example, WordPress URLs have a default format of /?p=id-number, which does not adhere to SEO best practices and is not particularly user-friendly.

To fix this issue, you need to create a custom URL structure for your permalinks and archives. This improves the look of the URL for visitors and people that share your link, and it also allows you to add relevant and local keywords to a page’s URL.

To fix this WordPress default setting, log in to your WordPress dashboard and go to Settings and click on Permalinks:

There you will be able to change your setting to “Post Name.” Changing this setting will allow you to create SEO-friendly URLs like:

https://websitename.com/products/blue-widgets

Please note that after you change the permalink structure on your website, you may need to create redirects from the old URLs to the new ones (assuming your CMS doesn’t do this automatically).
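For sites not running on a CMS, slug generation is straightforward to do yourself. Here is a hypothetical sketch of building an SEO-friendly slug from a page or product title, following the same conventions as WordPress's "Post Name" permalinks (lowercase ASCII words separated by hyphens).

```python
import re
import unicodedata

def slugify(title):
    """Turn a human-readable title into a lowercase, hyphenated URL slug."""
    # Normalize accented characters to ASCII equivalents, dropping the rest.
    ascii_title = (unicodedata.normalize("NFKD", title)
                   .encode("ascii", "ignore").decode("ascii"))
    # Lowercase, collapse runs of non-alphanumerics into single hyphens.
    return re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")

print(slugify("Blue Widgets: 50% Off!"))  # blue-widgets-50-off
print(slugify("Café Menu"))               # cafe-menu
```

Keeping slugs short, descriptive and stable (with 301 redirects whenever one must change) covers the URL best practices described above.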

Make your site secure

If your site isn’t secure (i.e., not HTTPS), making it secure is something you should add to your to-do list. In January 2017, Google started showing “not secure” warnings for Chrome users on HTTP pages with a password or credit card field. And, as of October 2017, they’ve expanded this warning to display when users enter data on an HTTP page, and on all HTTP pages visited in Incognito mode.

Even worse, their goal is to eventually display this warning on all HTTP pages. With all the press about cyber-security and protecting your personal information online, seeing this “Not Secure” warning on your site could scare off potential customers. Google is essentially warning people not to visit your site. Since many people are apt to close a website if they see a security warning, that means you could be losing a lot of business.

The bottom line: If your site’s not secure, you could be losing business to competitors.

(For a primer on making the switch from HTTP to HTTPS, check out this guide by Patrick Stox: “HTTP to HTTPS: An SEO’s guide to securing a website.”)

There are immediate benefits to having a secure site, too. If you have a secure site, the https:// and the green locked padlock that appear next to your URL in Chrome will make your website seem more trustworthy than a competitor’s site that isn’t secure.

And, of course, Google has stated that secure sites receive a slight rankings boost. Though this boost is fairly minor, it could still give you an edge over a competitor, all else being equal.

Write quality content: End writer’s block

Not only does Google like fresh, relevant, high-quality content — your site visitors do, too.

When it comes to writing long-form content, however, some people freeze up with writer’s block. How can you determine what to write about in order to satisfy users and drive relevant traffic?

Rest easy. There are amazing tools out there that can help you find the most popular questions people ask about a particular topic, and these types of questions and answers make for great content fodder.

Each of these tools has a different algorithm they use to find popular questions that need answering, but many pull top-asked questions from Google, various user forums, Quora, e-commerce sites and more. Finding these questions and writing a piece of content that answers those questions can squash writer’s block — fast! Now you can write content that actually answers questions potential customers are really asking.

Here are just a few of the “content crushing” tools I use:

Question Samura
Storybase
Answer the Public
BuzzSumo Question Analyzer
BlogSearchEngine.org
HubSpot’s Blog Topic Generator

Which local SEO tactics are YOU using to beat your competition?

I’d love to know what local tactics are giving you a competitive edge in rankings. Are you using any of the tactics above? Different ones? Let us know!


The importance of targeting branded searches

Search experts understand the importance of targeting non-branded search terms: Optimizing for high-volume, non-branded terms can drive a significant amount of traffic to your brand’s site.

While targeting non-branded search terms plays an essential role in your overall search strategy, many brands still underestimate and neglect the power of branded search terms. By relying on the strength of a brand and integrating branded search tactics with current non-branded search strategies, your business can discover more qualified leads — and, as a result, increase conversions.

The role of branded search in consumer behavior

For many established companies, their branded terms make up a majority of their keyword profile. If people are searching for your brand or products by name, they’re likely deeper within the sales funnel. In fact, Google has found that branded keywords have over two times higher conversion rates when compared to non-branded terms. So why would brands shy away from stepping up their branded search efforts?

Let’s flip the script and put you in the customer’s shoes. Say you’re searching for a fitness tracker your brother would love this holiday season. When you begin your gift hunt, are you more likely to search for “best fitness tracker for men,” or for “best Fitbit for men”?

Data from Google AdWords Keyword Planner

Due to Fitbit’s brand awareness efforts, the product is iconic enough that consumers search for it more often than non-branded terms. Search engines like Google, Bing and Amazon recognize the strength of the brand — SERP layouts and competitive pricing reflect this.

Brands working to improve their conversions need to work the entire marketing funnel. For brands or products well-known enough for branded search terms to be relevant to audiences, it’s important to know how audiences discover your products so you can target these branded search terms. Otherwise, you’re leaving money on the table for competitors and review sites to take for themselves.

Integrating branded search tactics into marketing strategy

With a strong brand, and thus stronger branded search terms, bread-and-butter search tactics will have some incredible advantages. These advantages span both paid search and organic search tactics, affecting every aspect of search from the page rankings, search boxes, knowledge panel, and even map results. With branded search, if your main site is optimized according to best practices, search engines will recognize it as the most relevant result for potential customers searching your brand.

Organic search

Your home page and (if applicable) product category pages should rank best for high-traffic branded search terms. Your title tags and meta descriptions should clearly display these branded search terms and relevant context that encourages searchers to make the decision to click. Once they do, the site should match the promise the SERP listing made with this copy.

Your goal should always be to dominate the first page and to obtain the highest positions with optimal branded search efforts. Brands should not only focus on their branded terms at the user’s research and consideration phases of the funnel, but also the post-purchase phase.

In the research phase, searchers will find strong, relevant brands first and foremost. Consider that they will also be looking for reviews, pricing and where to buy the product. This information should be available to users before they convert.

But the job isn’t done after converting. Post-purchase, many users will search for more information about the product using branded terms — installation instructions, how-to guides, proper cleaning and maintenance techniques, general product help and more — and these searches should lead to your website.

All of the brand-related terms throughout the sales funnel have heightened search term relevance to affect consideration, conversion and continued use.

Paid search

If you plan to own as much real estate on the SERPs as possible, paid search is an essential tactic to earn qualified leads. Even if you have obtained the top ranking in organic search results, research suggests that having an ad can produce incremental clicks. With little competition, it’s pretty easy (and cheap) to own these paid search spaces.

With branded search ads, you should be making use of ad extensions. These will provide more information to searchers, which can make your ad stand out and entice users to click. Certain extensions — such as sitelink, location or price extensions — can also increase your listing’s SERP real estate, particularly on mobile.

Keep in mind that Google factors ad extensions into its Ad Rank calculation, so proper use of extensions can give you an edge over competitors who may be running conquest campaigns on your brand name.

Other branded search considerations

Beyond the basics of SERPs with organic and paid search listings, you should take advantage of additional branded search real estate options, including:

Organic sitelinks, the links that are displayed under the top organic search result. They’re important since they occupy a lot of SERP real estate and can function as an outline of your site, helping users to navigate to your other top pages. Google determines whether it will provide sitelinks or not, so you don’t have direct control over this — but you can help Google out by submitting an XML sitemap and having your site set up with a logical hierarchy.

Apple shows six organic sitelinks for a branded search. Note the site search box, too.

You want to make sure map results are showing up for you if your brand or business has physical locations. To do this, you need to ensure that you’ve set up Google My Business listings with the correct NAP (name, address and phone) information.

If your site has an internal search function, you then have a solid chance of a search box showing up on Google. If it doesn’t display within the SERPs, you can utilize structured data markup per Google’s guidelines.

The Knowledge Graph helps users discover business information quickly and easily. Google will pull this information automatically from trustworthy sites like Wikipedia or WebMD. With the right mix of search tactics, you can obtain a Knowledge Graph result for your brand. Make sure that you have all social channels, a solid description, reviews and accurate information correctly displaying in the eye-catching Knowledge Graph.
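For the site search box, Google's guidelines rely on schema.org structured data. A minimal JSON-LD sketch might look like the following; the domain and the search URL pattern are placeholders you would replace with your site's own values.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```

The `target` URL must point at your site's real internal search results page, and Google still decides on its own whether to display the search box in the SERP.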

Final thoughts

Branded searches are imperative and shouldn’t be overlooked. Many assume that search queries involving your brand will naturally lead to your website, but that’s simply not the case. Without optimizing your paid and organic search efforts to capture branded searches throughout the entire purchase cycle, you’ll be missing out on tons of potential new traffic and conversions. Owning as much real estate as possible for your brand is crucial, especially during high-traffic seasonality.

Oftentimes, branded search terms can be the last channel touch point for consumers who are about ready to convert on one of your products or services. By incorporating branded search into your overall digital marketing strategy, you can quickly accelerate your brand, helping it stand out on the SERPs and provide a better experience to audiences.


AMP: A case for websites serving developing countries

Like Taylor Swift, Accelerated Mobile Pages (AMP) have a reputation. In a not-very-official Twitter poll, 53 percent claimed AMP was “breaking the web.”

What do you think about AMP?

— Maximiliano Firtman (@firt) March 23, 2017

The mobile ecosystem is already complex: choosing a mobile configuration, accounting for mobile-friendliness, preparing for the mobile-first index, implementing app indexation, utilizing Progressive Web Apps (PWAs) and so on. Tossing AMP into the mix, which creates an entirely duplicated experience, is not something your developers will be happy about.

And yet despite the various issues surrounding AMP, this technology has potential use cases that every international brand should pause to consider.

To start, AMP offers potential to efficiently serve content as fast as possible. According to Google, AMP reduces the median load time of webpages to .7 seconds, compared with 22 seconds for non-AMP sites.

And you can also have an AMP without a traditional HTML page. Google Webmaster Trends Analyst John Mueller has mentioned that AMP pages can be considered as a primary, canonical webpage. This has major implications for sites serving content to developing countries.

Yes, AMP is a restrictive framework that rigorously enforces its own best practices and forces one into its world of amphtml. However, within the AMP framework is a lot of freedom (and its capabilities have grown significantly over the last year). It has built-in efficiencies and smart content prioritization, and a site leveraging AMP has access to Google’s worldwide CDN: Google AMP Cache.
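For reference, a bare-bones AMP page follows a fixed skeleton: the amp (⚡) attribute on the html element, a canonical link, the AMP runtime script, and a mandatory boilerplate style (elided here for brevity; consult the AMP documentation for the exact required markup). The URL and content below are placeholders.

```html
<!doctype html>
<html ⚡ lang="en">
  <head>
    <meta charset="utf-8">
    <title>Product page</title>
    <link rel="canonical" href="https://www.example.com/product">
    <meta name="viewport" content="width=device-width,minimum-scale=1">
    <!-- AMP's mandatory <style amp-boilerplate> block goes here;
         omitted for brevity. See the AMP spec for the exact markup. -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
  </head>
  <body>
    <h1>Hello, AMP</h1>
    <!-- Images use the amp-img component instead of a plain img tag. -->
    <amp-img src="product.jpg" width="600" height="400"
             layout="responsive"></amp-img>
  </body>
</html>
```

The restrictions (no author JavaScript, components like amp-img with declared dimensions) are exactly what lets the framework prioritize content loading and serve pages from Google's AMP Cache.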

Source: “AMP: Above & Beyond” by Adam Greenberg

All of this is to say that if your brand serves the global market, and especially developing economies, AMP is worth the thought exercise of assessing its implications on your business and user experience.

What in the world-wide web would inspire one to consider AMP?

1. The internet is not the same worldwide

Akamai publishes an amazing quarterly report on the State of the Internet, and the numbers are startling — most of the world operates on 10 Mbps or less, with developing countries operating at less than 5 Mbps, on average.

If 10 Mbps doesn’t make your skin crawl, Facebook’s visual of 4G, 3G and 2G networks worldwide from 2016 (below) will.

Source: Facebook

The visuals show a clear picture: Developing countries don’t have the same internet and wireless network infrastructure as developed economies. This means that brands serving developing countries can’t approach them with the same formula.

2. Websites overall are getting chunkier

While all of this is happening, the average size of a website is increasing… and rapidly. According to reports by HTTParchive.org, the average total size of a webpage in 2017 is 387 percent larger than in 2010.

Despite the number of requests remaining consistent over time, the size of files continues to trend upward at an alarming rate. Creating larger sites may be okay in developed economies with strong networking infrastructures; however, users within developing economies could see a substantial lag in performance (which is especially important considering the price of mobile data).

3. Mobile is especially important for developing economies

The increase in website size and data usage comes at a time when mobile is vital within developing economies, as mobile is a lifeline connection for many countries. This assertion is reaffirmed by data from Google’s Consumer Barometer. For illustration, I’ve pulled device data to compare the US versus the developing economies of India and Kenya. The example clearly shows India and Kenya connect significantly more with mobile devices than desktop or tablet.

Source: Consumer Barometer with Google

4. Like winter, more users are coming

At the same time, the internet doesn’t show any signs of slowing down, especially not in developing countries. A recent eMarketer study on Internet Users Worldwide (August 2017) shows a high level of growth in developing countries, such as India at 15.2 percent. Even the US saw a 2.2 percent bump in user growth!

User penetration as a percent of a country’s total population shows there is still room for growth as well — especially in developing countries.

5. The divide in speed is growing

In the chart below, I chose nine developing countries (per the United Nations’ World Economic Situation and Prospects report) to compare with the United States’ internet speed (which ranked 10th worldwide in the last report). Despite the overarching trend of growth, there is a clear divide emerging in late 2012, and it appears to be growing.


Why is this significant? As internet connection speeds increase, so do page sizes. But as page sizes increase to match the fast speeds expected in developed nations, it means that users in developing nations are having a worse and worse experience with these websites.
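A quick back-of-the-envelope calculation makes the stakes concrete. The figures below are illustrative assumptions, not measurements: roughly 3.5 MB as a stand-in for an average 2017 page, and 2 Mbps versus 25 Mbps as stand-ins for a slow mobile link in a developing market versus developed-market broadband.

```python
# Back-of-envelope: how long a page of a given size takes to transfer
# over a given link. All numbers are illustrative assumptions.

def transfer_seconds(page_mb: float, link_mbps: float) -> float:
    """Seconds to move page_mb megabytes over a link_mbps connection."""
    page_megabits = page_mb * 8  # 1 byte = 8 bits
    return page_megabits / link_mbps

page_mb = 3.5  # assumed average page weight
for label, mbps in [("~2 Mbps (slow mobile link)", 2.0),
                    ("25 Mbps (broadband)", 25.0)]:
    print(f"{label}: {transfer_seconds(page_mb, mbps):.1f} s")
```

The same page that loads in about a second on broadband takes on the order of 14 seconds over a 2 Mbps link, before any rendering work even begins.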

So, what should one do about it?

The data above paint a picture: Internet penetration worldwide continues to grow rapidly, especially in developing nations where mobile devices are the primary way to access the internet. At the same time, webpages are getting larger and larger, potentially leading to a poor user experience for internet users in developing nations, where average connection speeds have fallen far behind those in the US and other developed nations.

How can we address this reality to serve the needs of users in developing economies?

Test your mobile experience.

AMP isn’t necessary if your site leverages mobile web optimization techniques, runs lean and is the picture of efficiency; however, this is challenging (especially given today’s web obesity crisis). Luckily, there are many tools that offer free speed analyses for webpages, including:

Test My Site tool (via Think With Google)
Page Speed Insights tool (via Google Developers)
Mobile-Friendly Test (via Google Search Console)
WebPageTest.org

Develop empathy through experience.

Allow yourself to step into your customers’ shoes and experience your site. As Rand Fishkin, former CEO of Moz, once aptly stated, “Customer empathy > pretty much everything else.”

Regular empathy is hard. Empathy for people you don’t know is nearly impossible. If we don’t see the problem, feel it and internalize the challenge, we can’t hope to alleviate it.

Facebook introduced 2G Tuesdays, where employees logging into the company’s app on Tuesday mornings are offered the option to switch to a simulated 2G connection for an hour to build empathy for users in the developing world. If you’re looking to try something similar, any Chrome/Canary user can simulate any connection experience via the Network panel in Chrome Developer Tools.

Consider whether AMP is right for your site.*

You should entertain the thought of leveraging AMP as a primary experience if your brand meets the following criteria:

Your site struggles with page-speed issues.
You’re doing business in a developing economy.
You’re doing business with a country with network infrastructure issues.
The countries you target leverage browsers and search engines that support AMP.
Serving your content to users as efficiently as possible is important to your brand, service and mission.

*Note: AMP’s architecture can also be used to improve your current site and inform your page speed optimization strategy, including:

Paying attention to and limiting heavy third-party JavaScript, complex CSS and non-system fonts (where impactful to web performance and not interfering with the UX).
Making scripts asynchronous (where possible).
For HTTP/1.1, limiting requests to prevent round-trip loss via pruning or inlining (this does not apply to HTTP/2, thanks to multiplexing).
Leveraging resource hints (a.k.a. the Pre-* Party), where applicable.
Optimizing images: using the optimal format, applying appropriate compression, keeping images as close to their display size as possible, using the image SRCSET attribute, lazy loading (when necessary) and so on.
Using caching mechanisms appropriately.
Leveraging a CDN.
Paying attention to and actively evaluating the page’s critical rendering path.
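A few of these techniques can be sketched in plain HTML. This is a minimal illustration, not a drop-in recipe; the host name, file paths and image widths are placeholders:

```html
<head>
  <!-- Resource hint: open the connection to a third-party host early -->
  <link rel="preconnect" href="https://cdn.example.com">
  <!-- Non-blocking script: async lets the parser continue while it downloads -->
  <script async src="/js/analytics.js"></script>
</head>
<body>
  <!-- Responsive image: the browser picks the smallest adequate file;
       loading="lazy" defers offscreen images (native support is recent;
       a JavaScript fallback is needed for older browsers) -->
  <img src="/img/product-480.jpg"
       srcset="/img/product-480.jpg 480w, /img/product-1080.jpg 1080w"
       sizes="(max-width: 600px) 480px, 1080px"
       loading="lazy" alt="Product photo">
</body>
```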

Educate your team about AMP, and develop a strategy that works for your brand.

AMP has a plethora of great resources on the main AMP Project site and AMP by Example.

If you decide to go with AMP as a primary experience in certain countries, don’t forget to leverage the appropriate canonical/amphtml and hreflang tags. And make sure to validate your code!
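As a rough sketch of that tagging (all URLs here are placeholders): the canonical page points at its AMP counterpart, the AMP page points back, and each language or region variant declares its alternates:

```html
<!-- On the regular (canonical) page -->
<link rel="amphtml" href="https://example.com/amp/page.html">

<!-- On the AMP page -->
<link rel="canonical" href="https://example.com/page.html">

<!-- On both: language/region alternates, each set self-referencing -->
<link rel="alternate" hreflang="en-us" href="https://example.com/page.html">
<link rel="alternate" hreflang="en-in" href="https://example.com/in/page.html">
```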

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

Links: To speed or not to speed

When we first started as an agency, our link builders were evenly split into two camps: One would send out a flurry of emails to all sorts of sites and deal with them if they responded. The other would spend a significant amount of time doing due diligence prior to outreach so that anyone who did respond had already been vetted.

I always thought it was a good idea to let each new link builder find their own way, so I didn’t usually express a strong opinion about this divide. I could see the points of view of both sides, too. Why bother doing a lot of work up front if the webmaster wasn’t even going to respond? Why disappoint webmasters who did respond when you couldn’t work with them?

On the whole, I have grown to favor up-front due diligence over casting a wide net. I’m firmly of the opinion that some link-building tasks absolutely do not benefit from being sped up.

However, I do think other areas of link building can be made faster and more efficient. It’s not always a bad tradeoff to invest a little bit less manual effort in one area to free up more time and energy for higher-priority tasks.

Today, I discuss several major link-building tasks in terms of whether they can (and should) be “sped up” — through automation, outsourcing or just spending less time on them.

Content creation

Useful, relevant content is what drives most link-building efforts, so content creation is a task that often falls to link builders (especially when pursuing guest posting opportunities). Creating content is very labor-intensive, though, so it’s understandable that link builders might look for ways to spend less time on it.

Can you speed it up? Yes. However, you can end up with some real garbage if you try to take shortcuts to create good content. I once experimented with outsourcing some content, and let me tell you, I got what I paid for (very little)! It was the most generic nonsense ever, and I had to correct a ton of typos and grammatical errors.

I’m not saying don’t outsource here; I’m saying don’t think that fantastic content usually happens quickly.

Should you speed it up? No! See above. I think that anyone can create decent content (for the most part), but not everyone can create great content that stands on its own. If you’re going to outsource, understand that great content usually doesn’t come fast or cheap.

Discovery of potential linking partners

Identifying websites from which you want to pursue links is an activity that involves a fair amount of research. There are programs that can automate parts of this process, however.

Can you speed it up? Yes. Discovery software can generate a massive list of potential linking partners much more quickly than if you were to do this task manually.

Should you speed it up? I’m 50/50 on this, actually. I was strongly against automating discovery in the past, but after using a tool that spat out a list of potential partner sites based on my criteria, I definitely understand its usefulness and efficiency. Sometimes, programs like these find something you didn’t see in your research. Just make sure you manually review your list of link prospects before reaching out.

Contact info gathering

Finding a potential linking partner is great, but not if you can’t figure out how to contact them. Link builders often need to spend time scouring a site to figure out who exactly to reach out to.

Can you speed it up? Definitely. With the way we review sites, it’s not usually a big deal to obtain contact info. However, if I had a big list of sites that I had vetted, it would be great to get the contact info quickly.

Should you speed it up? Yes, if you have a tool that does it. Just be aware that you may end up getting old email addresses or ones that aren’t the ones you want (like the IT director instead of the marketing director).

Due diligence

Performing due diligence work on a potential link partner requires time and effort. You need to make sure the website is relevant, authoritative, legitimate, free from penalties and adheres to whatever guidelines your client may have about linking partners.

Can you speed it up? Absolutely not. No no no no no. I verify that my link team has checked all the guidelines for each client, as well as our in-house guidelines, before we build the link. They’re good, but I catch a lot that they’ve missed. They do the same with me.

Due diligence for us is more than just metrics checking. We have clients who say, “No mommy blogs!” or will only accept links from sites hosted in certain countries, so it’s difficult to automate this well.

Should you speed it up? No. If you want great links, I would never speed up in this area. If you just want some crappy links for whatever reason, go for it.

Outreach

Reaching out to potential linking partners involves crafting emails (or private messages on social media platforms), which can often be quite time-consuming.

Can you speed it up? Yes — but I believe you should do so only if you have vetted the sites beforehand. You can speed it up no matter what, of course, but then you’re going to get replies from sites that aren’t the right fit if you haven’t done some upfront analysis.

Should you speed it up? I’m split on this one. As mentioned above, I think you can speed up outreach if you have vetted the sites beforehand. However, I prefer a more personalized approach, and that can’t really be sped up. I’d rather spend more time writing an email that gets opened and encourages a response.

Recently, a webmaster responded to me and said that while she couldn’t give me a link, I’d written the best email she’d seen in a long time, and she wished me luck. I uttered a small curse, but it really made me feel good about doing so much work on the initial outreach.

Social broadcasting

Promoting your content through social media channels can often lead to traffic — and links. This is a task that can be automated, at least to some extent.

Can you speed it up? Of course. You can use different tools to broadcast whenever you want to broadcast. If you need to reach people in different time zones, it’s probably easier to make that more automated. If you’re just doing social broadcasting for a small site with one new article, though, I’d do that manually.

Should you speed it up? As long as you don’t overdo it and bombard people with your content, I think it’s fine. My main concern is that automation runs the very serious risk of inadvertently tweeting something inappropriate. I’ve seen many brands get crucified on social media when, in the middle of a mass shooting or an earthquake, their scheduled posts keep blasting followers with reminders to buy those shoes right now before they’re gone.

The bottom line

People want new techniques or ways to make link building more efficient. Sometimes that just isn’t doable. Building good links is one of the most labor-intensive processes in SEO, and that’s one reason why it’s so frequently outsourced.

However, if you take shortcuts when you shouldn’t, you’ll probably end up spending extra time either removing those links or disavowing them — so I’d rather slow down and really intensively and manually evaluate a site before trying to get a link there.
