AMP: A case for websites serving developing countries

Like Taylor Swift, Accelerated Mobile Pages (AMP) have a reputation. In a not-very-official Twitter poll, 53 percent claimed AMP was “breaking the web.”

What do you think about AMP?

— Maximiliano Firtman (@firt) March 23, 2017

The mobile ecosystem is already complex: choosing a mobile configuration, accounting for mobile-friendliness, preparing for the mobile-first index, implementing app indexation, utilizing Progressive Web Apps (PWAs) and so on. Tossing AMP into the mix, which creates an entirely duplicated experience, is not something your developers will be happy about.

And yet despite the various issues surrounding AMP, this technology has potential use cases that every international brand should pause to consider.

To start, AMP offers the potential to serve content as fast as possible. According to Google, AMP reduces the median load time of webpages to 0.7 seconds, compared with 22 seconds for non-AMP pages.

And you can also have an AMP page without a traditional HTML page. Google Webmaster Trends Analyst John Mueller has mentioned that AMP pages can be considered as a primary, canonical webpage. This has major implications for sites serving content to developing countries.

Yes, AMP is a restrictive framework that rigorously enforces its own best practices and forces one into its world of amphtml. However, within the AMP framework is a lot of freedom (and its capabilities have grown significantly over the last year). It has built-in efficiencies and smart content prioritization, and a site leveraging AMP has access to Google’s worldwide CDN: Google AMP Cache.

Source: “AMP: Above & Beyond” by Adam Greenberg

All of this is to say that if your brand serves the global market, and especially developing economies, AMP is worth the thought exercise of assessing its implications on your business and user experience.

What in the world-wide web would inspire one to consider AMP?

1. The internet is not the same worldwide

Akamai publishes an amazing quarterly report on the State of the Internet, and the numbers are startling — most of the world operates on 10 Mbps or less, with developing countries operating at less than 5 Mbps, on average.

If 10 Mbps doesn’t make your skin crawl, Facebook’s visual of 4G, 3G and 2G networks worldwide from 2016 (below) will.

Source: Facebook

The visuals show a clear picture: Developing countries don’t have the same internet and wireless network infrastructure as developed economies. This means that brands serving developing countries can’t approach them with the same formula.

2. Websites overall are getting chunkier

While all of this is happening, the average size of a website is increasing… and rapidly. According to reports from HTTParchive.org, the average total size of a webpage in 2017 is 387 percent larger than in 2010.

Despite the number of requests remaining consistent over time, the size of files continues to trend upward at an alarming rate. Creating larger sites may be okay in developed economies with strong networking infrastructures; however, users within developing economies could see a substantial lag in performance (which is especially important considering the price of mobile data).

3. Mobile is especially important for developing economies

The increase in website size and data usage comes at a time when mobile is vital within developing economies, as mobile is a lifeline connection for many countries. This assertion is reaffirmed by data from Google’s Consumer Barometer. For illustration, I’ve pulled device data to compare the US versus the developing economies of India and Kenya. The example clearly shows India and Kenya connect significantly more with mobile devices than desktop or tablet.

Source: Consumer Barometer with Google

4. Like winter, more users are coming

At the same time, the internet doesn’t show any signs of slowing down, especially not in developing countries. A recent eMarketer study on Internet Users Worldwide (August 2017) shows a high level of growth in developing countries, such as India, at 15.2 percent. Even the US saw a +2.2 percent bump in user growth!

User penetration as a percent of a country’s total population shows there is still room for growth as well — especially in developing countries.

5. The divide in speed is growing

In the chart below, I chose nine developing countries (per the United Nations’ World Economic Situation and Prospects report) to compare with the United States’ internet speed (which ranked 10th worldwide in the last report). Despite the overarching trend of growth, there is a clear divide emerging in late 2012 — and it appears to be growing.


Why is this significant? As internet connection speeds increase, so do page sizes. But as page sizes increase to match the fast speeds expected in developed nations, it means that users in developing nations are having a worse and worse experience with these websites.

So, what should one do about it?

The data above paint a picture: Internet penetration worldwide continues to grow rapidly, especially in developing nations where mobile devices are the primary way to access the internet. At the same time, webpages are getting larger and larger — potentially leading to a poor user experience for internet users in developing nations, where average connection speeds have fallen far behind those in the US and other developed nations.

How can we address this reality to serve the needs of users in developing economies?

Test your mobile experience.

AMP isn’t necessary if your site leverages mobile web optimization techniques, runs lean and is the picture of efficiency; however, this is challenging (especially given today’s web obesity crisis). Luckily, there are many tools that offer free speed analyses for webpages, including:

- Test My Site tool (via Think With Google)
- PageSpeed Insights tool (via Google Developers)
- Mobile-Friendly Test (via Google Search Console)
- WebPageTest.org

Develop empathy through experience.

Allow yourself to step into your customers’ shoes and experience your site. As former CEO of Moz, Rand Fishkin, once aptly stated, “Customer empathy > pretty much everything else.”

Regular empathy is hard. Empathy for people you don’t know is nearly impossible. If we don’t see the problem, feel it and internalize the challenge, we can’t hope to alleviate it.

Facebook introduced 2G Tuesdays, where employees logging into the company’s app on Tuesday mornings are offered the option to switch to a simulated 2G connection for an hour, building empathy for users in the developing world. If you’re looking to try something similar, any Chrome/Canary user can simulate connection speeds via the Network panel in Chrome Developer Tools.

Consider if AMP is right for your site.*

You should entertain the thought of leveraging AMP as a primary experience if your brand meets the following criteria:

- Your site struggles with page-speed issues.
- You’re doing business in a developing economy.
- You’re doing business in a country with network infrastructure issues.
- The countries you target leverage browsers and search engines that support AMP.
- Serving your content to users as efficiently as possible is important to your brand, service and mission.

*Note: AMP’s architecture can also be used to improve your current site and inform your page speed optimization strategy, including:

- Paying attention to and limiting heavy third-party JavaScript, complex CSS and non-system fonts (where impactful to web performance and not interfering with the UX).
- Making scripts asynchronous (where possible).
- For HTTP/1.1, limiting calls to prevent round-trip losses via pruning or inlining (this does not apply to HTTP/2, thanks to multiplexing).
- Leveraging resource hints (a.k.a. the Pre-* Party), where applicable.
- Optimizing images (using the optimal format, appropriate compression, keeping images as close to their display size as possible, the image srcset attribute and lazy loading where necessary).
- Using caching mechanisms appropriately.
- Leveraging a CDN.
- Paying attention to and actively evaluating the page’s critical rendering path.
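A few of these techniques can be sketched in plain HTML. This is illustrative only — the host names and file paths below are placeholders, not references to any real site:

```html
<!-- Resource hints: warm up the connection to a third-party host early. -->
<link rel="preconnect" href="https://cdn.example.com">
<link rel="preload" href="/fonts/body.woff2" as="font" type="font/woff2" crossorigin>

<!-- Asynchronous script: doesn't block rendering while it downloads. -->
<script src="/js/analytics.js" async></script>

<!-- Responsive, lazy-loaded image: the browser picks the size closest
     to the display size instead of always downloading the largest file. -->
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     loading="lazy"
     alt="Product hero image">
```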

Educate your team about AMP, and develop a strategy that works for your brand.

AMP has a plethora of great resources on the main AMP Project site and AMP by Example.

If you decide to go with AMP as a primary experience in certain countries, don’t forget to leverage the appropriate canonical/amphtml and hreflang tags. And make sure to validate your code!
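As a sketch of that tagging (all URLs here are hypothetical), a paired setup links the two versions in both directions, and hreflang annotations identify the country-specific variants:

```html
<!-- On the traditional page: point to its AMP counterpart. -->
<link rel="amphtml" href="https://example.com/in/article.amp.html">

<!-- On the AMP page: point back to the canonical page.
     (If AMP is your primary experience, the AMP page canonicals to itself.) -->
<link rel="canonical" href="https://example.com/in/article.html">

<!-- On both versions: hreflang annotations for country targeting. -->
<link rel="alternate" hreflang="en-in" href="https://example.com/in/article.html">
<link rel="alternate" hreflang="en-us" href="https://example.com/us/article.html">
```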

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

SEO + UX = Success

In the good old days, SEO was simple. You stuffed a page full of keywords, and you ranked number one. Oh, if only it were that simple today! Now, Google (and the other search engines) literally take hundreds of factors into account when determining which pages rank high in search engine results pages (SERPs).

This new reality means that elements of user experience (UX) have been rolled into SEO best practices. How easy is your site to navigate? Do you have quality content that makes visitors want to stay and engage? Is your site secure, fast and mobile-friendly?

Think of the partnership of SEO and UX this way: SEO targets search engines, and UX targets your website’s visitors. Both share a common goal of giving users the best experience.

Here are some common website elements that impact both SEO and user experience.

Headings

Just as the headings of a printed work make it easier to find information, the headings of a web page make it easier for both visitors and search engine crawlers to understand and parse your content.

Headings (<h1>, <h2>, <h3>, <h4>, <h5> and <h6>) should tell the readers and search engines what the paragraphs/sections are about and show a logical hierarchy of the content. Headings also help users if they get lost on a page.

Only use one h1 tag on a page — that lets search engines and users know the page’s primary focus. H1s are normally the first piece of content on a page, placed near the top. (Think of h1s as the chapter title of a book.) Adding keywords toward the front of a heading can also help with rankings.

Other headers (h2 through h6) should follow h1s to structure and organize the rest of the page appropriately. The other headings can be used several times on a page, as long as it makes sense. You do not need to use all of them, either — sometimes your content may only need an h1 and some h2s.
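For illustration, a hypothetical page about running shoes might structure its headings like this (indentation is only for readability — it shows the hierarchy, not required markup):

```html
<h1>The Complete Guide to Running Shoes</h1>
  <h2>Road Running Shoes</h2>
    <h3>Cushioned Trainers</h3>
    <h3>Racing Flats</h3>
  <h2>Trail Running Shoes</h2>
    <h3>Waterproof Models</h3>
```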

Easy navigation and site structure

It may seem crazy that we’re still talking about easy site navigation… but we are. There are way too many sites out there that simply don’t get it. Your site structure is not only important for your users, but it’s your site’s roadmap for the search engines, too.

Remember that many of your visitors will not enter your site through your home page. This means that your site needs to be easy to navigate — no matter which page a searcher (or search engine crawler) lands on.

Your site’s navigation is not the place for fancy popups, a long list of options, hide-and-seek games or a place of dead ends where the user doesn’t know how to get back to another section of your site or get back to your home page.

Take a look at how healthcare giant Anthem’s menu overtakes the screen — on both desktop and mobile — when the menu is clicked:

With the menu literally filling the entire screen, a user can’t read the content that’s underneath the navigation. This creates a very poor user experience. When people are on mobile devices, chances are they won’t have the patience to deal with menus like this.

Additionally, a clean site navigation and structure can also lead to sitelinks appearing in Google search results. Sitelinks can help you take over more real estate on search engine result pages — which means less room for your competitors (and, hopefully, more clicks for you).

Google’s algorithm decides which sites get sitelinks (and which ones don’t). They base this decision largely on a site’s structure:

We only show sitelinks for results when we think they’ll be useful to the user. If the structure of your site doesn’t allow our algorithms to find good sitelinks, or we don’t think that the sitelinks for your site are relevant for the user’s query, we won’t show them.

User signals

I believe that user signals will increasingly become a more prominent factor in search engine rankings. Do you have Posts on Google My Business that visitors are clicking on? Are visitors on mobile devices using the click-to-call feature to dial your business? Are happy customers leaving five-star reviews for you — and are you responding to those reviews?

Although Google has denied that user signals such as time on site or bounce rate are direct ranking factors, studies have shown that there is a strong correlation between these signals and top rankings. Let’s put it this way: Google sees and knows everything. Every touch point and interaction your visitors have with you (and you have with them) shows Google that users are interested in and engaging with your content.

Site speed

Site speed has long been a ranking factor for Google search, and the company has even announced that mobile page speed (rather than desktop) will soon be used to determine this ranking factor. So not only is it important to have a website that loads quickly, but your mobile experience needs to be fast as well.

Google’s PageSpeed Insights tool allows you to enter your URL to see the performance issues your site might be having. PageSpeed Insights measures how the page can improve its performance on both time to above-the-fold load and time to full page load, and it provides concrete suggestions for reducing page load time.

Amazingly, even the big sites with presumably large development and IT budgets have speed issues. See the poor results for the Harvard Business Review site:

Content-heavy and news sites should especially pay attention to site speed issues, since these sites are often viewed on mobile devices for the sake of convenience.

Mobile experience

When you think of “mobile experiences,” speed is definitely one consideration, but so is your mobile website as a whole — the look, feel, navigation, text, images and so forth.

Ever since Google released its mobile-friendly update in 2015, webmasters and SEOs have had to take “mobile-friendliness” into account as a ranking factor. And now, with the mobile-first index said to be coming in 2018, your mobile site will be considered your “main” website when Google’s algorithm is calculating rankings — thus making a good mobile experience all the more crucial.

Navigation is one of the most important components of a mobile experience — users and Google need to be able to find what they’re looking for quickly. Even button sizes and designs can impact user interaction on your mobile website. Every element on your mobile website impacts a user’s experience and directly (or indirectly) affects SEO as well.

In searching for an example of a local business’s mobile website, I found the one shown below. For this company’s mobile site, more than half of the above-the-fold real estate is taken up with meaningless information like huge logos and social media buttons. Plus, their menu is teeny-tiny and doesn’t even say “Menu” — it says “Go To…” and has the actual link to the menu to the far right-hand side. This does not make for a very user-friendly experience.

This company would be better off taking the clutter away from the top of the screen and making their menu, products and services more prominent for their mobile users.

Simple and smart design decisions like this will go a long way to making not only your visitors happy, but Google, too!

SEO and UX: A winning combination

Hopefully, you can see how SEO and UX go hand-in-hand in creating a successful website experience for both your human visitors and the search engines.

But what do you think? Do you think of your site’s users when you are creating content? How do you work with your design team to ensure that your site makes for a great mobile experience for your users? What is your balance between SEO factors and UX factors? We’d love to know!


19 technical SEO facts for beginners

Technical SEO is an awesome field. There are so many little nuances to it that make it exciting, and its practitioners are required to have excellent problem-solving and critical thinking skills.

In this article, I cover some fun technical SEO facts. While they might not impress your date at a dinner party, they will beef up your technical SEO knowledge — and they could help you in making your website rank better in search results.

Let’s dive into the list.

1. Page speed matters

Most think of slow load times as a nuisance for users, but its consequences go further than that. Page speed has long been a search ranking factor, and Google has even said that it may soon use mobile page speed as a factor in mobile search rankings. (Of course, your audience will appreciate faster page load times, too.)

Many have used Google’s PageSpeed Insights tool to get an analysis of their site speed and recommendations for improvement. For those looking to improve mobile site performance specifically, Google has a new page speed tool out that is mobile-focused. This tool will check the page load time, test your mobile site on a 3G connection, evaluate mobile usability and more.

2. Robots.txt files are case-sensitive and must be placed in a site’s main directory

The file must be named in all lower case (robots.txt) in order to be recognized. Additionally, crawlers only look in one place when they search for a robots.txt file: the site’s main directory. If they don’t find it there, oftentimes they’ll simply continue to crawl, assuming there is no such file.
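For illustration, a minimal file (the rules here are hypothetical) served from the root of the domain:

```
# Served from https://example.com/robots.txt (the only location crawlers check)
User-agent: *
Disallow: /checkout/

Sitemap: https://example.com/sitemap.xml
```

A file at, say, `/pages/robots.txt` — or one named `Robots.txt` — will simply be ignored.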

3. Crawlers can’t always access infinite scroll

And if crawlers can’t access it, the page may not rank.

When using infinite scroll on your site, make sure there is a paginated series of pages in addition to the one long scroll, and implement replaceState/pushState on the infinite scroll page so the URL updates as the user scrolls. This is an optimization many web developers are not aware of, so make sure to check your infinite scroll for rel="next" and rel="prev" annotations in the code.
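A sketch of the component parts, with hypothetical URLs: each chunk of the infinite scroll corresponds to a crawlable paginated page, the paginated pages link to each other, and a script keeps the URL in sync as the user scrolls.

```html
<!-- In the <head> of /shoes?page=2, the crawlable paginated version: -->
<link rel="prev" href="https://example.com/shoes?page=1">
<link rel="next" href="https://example.com/shoes?page=3">

<script>
  // On the infinite-scroll page, call this as each page-sized chunk
  // enters the viewport, so the address bar always reflects a real,
  // crawlable paginated URL.
  function onChunkVisible(pageNumber) {
    history.replaceState(null, '', '/shoes?page=' + pageNumber);
  }
</script>
```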

4. Google doesn’t care how you structure your sitemap

As long as it’s XML, you can structure your sitemap however you’d like — category breakdown and overall structure is up to you and won’t affect how Google crawls your site.

5. The noarchive tag will not hurt your Google rankings

This tag will keep Google from showing the cached version of a page in its search results, but it won’t negatively affect that page’s overall ranking.

6. Google usually crawls your home page first

It’s not a rule, but generally speaking, Google usually finds the home page first. An exception would be if there are a large number of links to a specific page within your site.

No, but that's commonly the first page we find from a site.

— John ☆.o(≧▽≦)o.☆ (@JohnMu) August 24, 2017

7. Google scores internal and external links differently

A link to your content or website from a third-party site is weighted differently than a link from your own site.

8. You can check your crawl budget in Google Search Console

Your crawl budget is the number of pages that search engines can and want to crawl in a given amount of time. You can get an idea of yours in your Search Console. From there, you can try to increase it if necessary.

9. Disallowing pages with no SEO value will improve your crawl budget

Pages that aren’t essential to your SEO efforts often include privacy policies, expired promotions or terms and conditions.

My rule is that if the page is not meant to rank, and it does not have 100 percent unique quality content, block it.

10. There is a lot to know about sitemaps

- XML sitemaps must be UTF-8 encoded.
- They cannot include session IDs in URLs.
- They must contain no more than 50,000 URLs and be no larger than 50 MB.
- A sitemap index file is recommended instead of multiple individual sitemap submissions.
- You may use different sitemaps for different media types: video, images and news.
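A sitemap index is itself a small XML file that points to your individual sitemaps. For illustration (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-images.xml</loc>
  </sitemap>
</sitemapindex>
```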

11. You can check how Google’s mobile crawler ‘sees’ pages of your website

With Google migrating to a mobile-first index, it’s more important than ever to make sure your pages perform well on mobile devices.

Use Google Console’s Mobile Usability report to find specific pages on your site that may have issues with usability on mobile devices. You can also try the mobile-friendly test.

12. Half of page one Google results are now HTTPS

Website security is becoming increasingly important. In addition to the ranking boost given to secure sites, Chrome is now issuing warnings to users when they encounter sites with forms that are not secure. And it looks like webmasters have responded to these updates: According to Moz, over half of websites on page one of search results are HTTPS.

13. Try to keep your page load time to 2 to 3 seconds

Google Webmaster Trends Analyst John Mueller recommends a load time of two to three seconds (though a longer one won’t necessarily affect your rankings).

14. Robots.txt directives do not stop your website from ranking in Google (completely)

There is a lot of confusion over the “Disallow” directive in your robots.txt file. Your robots.txt file simply tells Google not to crawl the disallowed pages/folders/parameters specified, but that doesn’t mean these pages won’t be indexed. From Google’s Search Console Help documentation:

You should not use robots.txt as a means to hide your web pages from Google Search results. This is because other pages might point to your page, and your page could get indexed that way, avoiding the robots.txt file. If you want to block your page from search results, use another method such as password protection or noindex tags or directives.
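To see the crawl-versus-index distinction in practice, here is a minimal sketch using Python’s standard-library robots.txt parser, with hypothetical rules. A disallowed path is blocked from crawling — but keeping a page out of the index requires a noindex tag instead.

```python
from urllib import robotparser

# Hypothetical robots.txt rules for example.com.
rules = [
    "User-agent: *",
    "Disallow: /promo/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Crawling under /promo/ is disallowed...
print(rp.can_fetch("Googlebot", "https://example.com/promo/sale"))  # False
# ...but other paths remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/products"))    # True

# Note: a disallowed page can still be indexed via external links.
# To keep a page out of search results, serve it with
# <meta name="robots" content="noindex"> (and allow it to be crawled).
```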

15. You can add canonical tags from new domains to your main domain

This allows you to keep the value of the old domain while using a newer domain name in marketing materials and other places.

16. Google recommends keeping redirects in place for at least one year

Because it can take months for Google to recognize that a site has moved, Google representative John Mueller has recommended keeping 301 redirects live and in place for at least a year.

Personally, for important pages — say, a page with rankings, links and good authority redirecting to another important page — I recommend you never get rid of redirects.

17. You can control your search box in Google

Google may sometimes include a search box with your listing. This search box is powered by Google Search and works to show users relevant content within your site.

If desired, you can choose to power this search box with your own search engine, or you can include results from your mobile app. You can also disable the search box in Google using the nositelinkssearchbox meta tag.
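The opt-out is a one-line meta tag in the page head:

```html
<meta name="google" content="nositelinkssearchbox">
```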

18. You can enable the ‘notranslate’ tag to prevent translation in search

The “notranslate” meta tag tells Google that they should not provide a translation for this page for different language versions of Google search. This is a good option if you are skeptical about Google’s ability to properly translate your content.
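Like the sitelinks search box opt-out, this is a single meta tag in the head (it can also be applied to individual elements via a notranslate class):

```html
<meta name="google" content="notranslate">
```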

19. You can get your app into Google Search with Firebase app indexing

If you have an app that you have not yet indexed, now is the time. By using Firebase app indexing, you can enable results from your app to appear when someone who’s installed your app searches for a related keyword.

Staying up to date with technical SEO

If you would like to stay up to date with technical SEO, there are a few great places to do that.

- First, I recommend you watch the videos Barry Schwartz does each week.
- Second, keep your eye on Search Engine Land.
- Third, jump on every blog post Google publishes on Google Webmaster Central.
- Finally, it is always a good idea to jump into a Google Webmaster hangout or simply watch the recording on YouTube.

I hope you enjoyed these 19 technical SEO facts. There are plenty more, but these are a few fun ones to chew on.


Accelerated Mobile Pages (AMP) conquer the competition for shoe retailer

In the highly competitive footwear vertical, no season matters more than late summer, when shoppers spend $27 billion on supplies and clothing for the coming school year.

According to the Deloitte back-to-school survey for 2017, some 55 percent of that spend, about $15 billion, is devoted to clothing and accessories. Late summer may be only the second-biggest shopping season of the year in the United States, but for verticals like footwear, it’s number one.

A top shoe retailer came to Brandify (disclosure: my employer) for a solution to boost local store visibility online. To achieve the retailer’s goal, we worked in collaboration with SEO consultant Steve Wiideman to implement Accelerated Mobile Pages (AMP) for the retailer’s nearly 500 US stores.

The open-source AMP Project, led and heavily promoted by Google in collaboration with Twitter, WordPress, Pinterest and LinkedIn, defines a lightweight standard for publishing web pages that makes them load very quickly on mobile devices. The standard includes special implementations of HTML and JavaScript, as well as the concept of an AMP Cache, which is a repository for serving pages converted to AMP.

Google’s AMP Cache is by far the biggest, and since early 2016, Google has been featuring AMP results prominently, first at the top of the mobile SERP in “zero position,” and later in the year as part of the ordinary list of search results. Google has reported that pages converted to AMP typically load in less than half a second and use one-tenth of the data used by the same page in non-AMP format.

It would seem like a no-brainer to use AMP for local store pages, and yet the local search industry has been slow to adopt the standard. During the first phase of the AMP Project’s rollout, it was believed that AMP, with its stripped-down publishing format, was only suited to news sites and blogs, where presentation of text content is the main point of the web page.

This began to change when eBay launched 8 million AMP product pages last summer, proving that e-commerce sites could benefit from fast page loads and simplified presentation. As Brafton’s Ben Silverman wrote on his company’s blog, “The auction site’s confident leap into the world of the accelerated mobile experience proves that fast-loading, neatly formatted, easy-to-use content is the best way to drive conversions and sales.”


We were eager to bring the benefits of AMP to our multilocation brand clients, and the shoe retailer’s request for a boost in traffic created a good opportunity. The switch to AMP involved a modest redesign of the local page layout for the brand, though because the retailer already preferred simplicity and utility in its web pages, the changes did not need to be dramatic.

The possibilities for interactive content are limited with AMP, and the presentation must remain simple, but developers and brands should not shy away from AMP for that reason. After all, quick access to relevant information is what mobile searchers want.

This supposition was borne out by the results for the shoe retailer. Even though AMP implementation by itself is not considered to be a ranking factor, the improvement in page design and load time correlated with a notable increase in session traffic.

Comparing the 20-day periods before and after the launch of pages converted to AMP on July 27 of this year, we saw an increase, period-over-period, of 32 percent in overall session traffic. What’s more, the impact was noticeable almost immediately on July 28, one day after launch.

Screenshot of Google Analytics showing AMP deployment on July 27 and subsequent spike in sessions.

The year-over-year results were even more dramatic, with sessions increasing 45 percent between July 28 and August 17, 2017 over the same period in 2016. Other factors may have contributed to this increase, but the immediate jump in traffic upon AMP launch is hard to deny as evidence of AMP as an isolated and significant contributor.

We also examined the retailer’s Google My Business (GMB) Insights and found a possible add-on effect. Greater prominence of local pages for the retailer seems to correlate with increased views and actions on Google listings for the brand.

Comparing the 20 days before and after launch, we saw a 9.4 percent increase in customer actions for the retailer’s Google listings, such as clicking to visit the brand website, requesting directions and clicking to call. Moreover, comparing the first 20 days after the launch of pages converted to AMP to the same period one year before, we measured a 21.3 percent increase in customer actions.

GMB Insights for shoe retailer shown in the Brandify dashboard

The implication of this result is that Google can connect pages hosted within its own AMP Cache with their corresponding website links in a store’s GMB listing. Performance of one’s business website is a known ranking factor for local listings, and AMP appears to be a great weapon for boosting local as well as organic results.

The retailer benefited significantly from the switch to AMP over a remarkably short period of time, ensuring the brand would remain at the forefront of consumer attention during the competitive back-to-school season. During the time period of the AMP campaign launch, no other significant changes were made to the retailer’s local campaign, so we feel we can claim with confidence that barring any external factors, AMP was the major driver of the positive results we measured.


Want to learn more about this case study and others related to AMP? Join us in New York for our SMX East search marketing conference, and be sure to check out the “AMP: Do or die?” session, featuring the author.


From big to small: 5 free image compression tools reviewed

Recently, I’ve found myself focusing more and more on optimizing page load times. Improving page speed is something that is generally pretty easily understood by clients, and it positively impacts user experience and conversion as well as SEO.

The challenge, frequently, can be that some elements of improving page speed can require significant input from development resources (e.g., prioritizing visible content, eliminating render-blocking JavaScript or CSS).

But there’s one element of page speed optimization that even non-technical marketers and content creators can contribute to: image optimization. As Kristine Schachinger points out in her excellent article on image optimization, resizing and compressing images can often be the easiest and highest-impact action for speeding up pages on your site.

Schachinger does a great job of outlining image compression and resizing best practices, but once you know which images need work (or if you just have some new images to add to your site), what’s the best tool for actually compressing images?

Since image compression can be such an easy win, I wanted to test the capabilities of five different free, standalone image compression tools that writers, designers or marketers can use to ensure that they’re keeping their image file size in check.

For this post, I ran three images from a site I own through each of the tools:

- A logo
- A large slider image that hadn’t been compressed
- An infographic we created that hadn’t been compressed

Two of the images were PNG, and one was JPG, and each had been generated without any focus on optimizing for size or for ultimately being compressed (as will frequently be the case “in the wild”).

There are a lot of different free image optimization tools. While I’m sure I’m not aware of all of them, I’ve checked out 15 to 20 different tools and found these five to be the best suited for most purposes. I think it’s likely that most people reading this post will find that one of these five is a good fit for their image compression needs.

Let’s run through the various tools I used to compress the images.

1. Optimizilla

Optimizilla has a very simple interface:

The biggest “pros” in Optimizilla’s favor are that the tool allows you to run up to 20 images through for compression at a time (a couple of the others on this list do as well) and that it has a great image preview feature which lets you dial up or down the “quality” of the photo.

This quality slider feature allows you to adjust your image compression based on whether the resulting image will look acceptable. Dialing down the quality shrinks the size of the image, so if you can’t see much of a difference at 60 percent versus 90 percent image quality, you may want to dial things down to 60 percent to reduce the image size as much as possible.
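The same quality trade-off can be reproduced programmatically. Here’s a minimal sketch using the third-party Pillow library (not any of the tools reviewed here); the function name and quality values are purely illustrative:

```python
from io import BytesIO

from PIL import Image  # third-party: pip install Pillow

def compress_jpeg(img, quality=60):
    """Re-encode an image as a JPEG at the given quality (1-95).

    Lower quality settings produce smaller files, at the cost of
    visible artifacts once you dial down too far.
    """
    buf = BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.getvalue()
```

Saving the same image at, say, quality 60 versus quality 90 and comparing the byte counts is a quick way to decide how far you can dial down before the difference becomes noticeable.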

2. TinyPNG

Like Optimizilla, TinyPNG has a nice, simple interface and allows you to run up to 20 images at a time. It also has a convenient “export to Dropbox” option:

3. Compressor.io

Compressor.io also offers effective compression. Unfortunately, there’s no bulk upload feature here, so you have to upload one image at a time:

4. Kraken

Kraken does allow you to upload multiple files. It also has some nice features, like allowing you to easily export files to Dropbox or import files from Box, Dropbox or Google Drive. Additionally, Kraken allows for “advanced” customization, like altering quality and orientation and preserving metadata for your photos.

The major downside to Kraken is that it was the only tool on the list that wouldn’t execute compression for all of the files in the free version of the tool. Our large infographic (which was a very big file at 1.7MB) hit their free cap. Their pro plans currently range from $5 to $79 a month.

5. Gift of Speed

Gift of Speed offers separate PNG and JPG compressors. It features bulk upload for PNG, but not for its JPG compressor.

What about the results?

As you can see below, all of the tools had a significant impact on image size. It’s worth noting that in each instance, I took the default version of the compressed image from each tool — I likely could have experienced even larger gains by tweaking the advanced settings in tools like Kraken and Optimizilla.

As you can see, on raw default performance, TinyPNG had the smallest file size for the two larger images, and Compressor.io had the smallest size for the logo. If you’re curious about before and after image qualities, I’ve uploaded all of the files to a public Dropbox folder here.

If you’re looking for a tool for your team to use that is very easy to use and outputs a much smaller image by default, I’d probably recommend TinyPNG. Personally, I’ll frequently use Optimizilla, as I find it’s the best combination of a simple interface with some easy-to-use customization (where I can preview and dial down the quality to help shrink a file if necessary).

How does this actually impact things?

You might be thinking, “OK, great, so you made these image files much smaller — but how much of an impact does that really have on how quickly my page loads?”

To help demonstrate how dramatic the impact can be, I created a page on a WordPress site and first added the initial images (and nothing else) to the body of the page and logged the total load time for the page, then removed them and replaced them with the smallest version of each file.

Page load with all of the original images added to a page:

5.882 seconds

Page load with the best-optimized version of each image added to a page:

2.369 seconds

So, page load time dropped by roughly 60 percent once the three images were optimized.
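For reference, the percentage improvement works out as follows (a trivial calculation, but handy when reporting results):

```python
before, after = 5.882, 2.369  # page load times in seconds, from above

reduction = (before - after) / before
print(f"Load time cut by {reduction:.1%}")  # Load time cut by 59.7%
```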

Now, this is obviously a particularly pronounced impact given the presence of the large infographic file, but your logo (and similar files) are present on every page on your site, and there’s a good chance that between blog posts, product images and other areas of your site, you have several non-optimized images. How much of an aggregate page load impact (and ultimately how much traffic and revenue) are you forfeiting by using these unoptimized images?


The non-developer’s guide to reducing WordPress load times up to 2 seconds (with data)

With Google’s continued focus on user experience and engagement metrics in recent algorithm updates, it’s become even more important for marketers to pay attention to how fast their sites are. Page speed has long been a ranking factor for desktop search results, and it may soon impact mobile rankings as well.

The benefits of improved load times go well beyond their impact on SEO and your site’s organic rankings, however. Consider recent Google data, which shows that “53 percent of visits are abandoned if a mobile site takes longer than three seconds to load,” or that “for every second delay in mobile page load, conversions can fall by up to 20 percent.”

So, how do you actually go about speeding up your site? For many non-technical marketers, trying to figure out how to improve page speed can be a daunting task. Which levers should you actually be pulling to generate a result? And how do you get those changes implemented on your site?

I’m not a developer. My company owns and operates a number of different (relatively simple) publishing sites built on WordPress. I set about working to improve load times for these sites without any developer intervention to see what kind of impact could result from some simple tweaks that anyone (even me!) could make. In this post, I’ll walk through each of the optimizations, explain what the impact on our sites was and share actual data around load times, Google Speed scores and more.

Three important points I’ll return to later in the post:

- As I said, these are fairly simple sites built in WordPress, so the plugins and solutions here are all WordPress-specific.
- A more complex site built on a different platform with different functions (e.g., e-commerce sites, more complex publishing sites) will have a lot of additional, more complex concerns and will also respond differently to these tactics than our sites did.
- Don’t let the perfect be the enemy of the good when it comes to page speed; developers may tell you that to achieve a pure “best practice” page load time, you need to redesign your site (which may not be practical for your company at the moment). While that may be the case, there’s likely some combination of the tactics outlined below that you can implement to help improve page speed. Help developers focus on the right metrics (which we’ll outline below) and work to get better.

All of that said, this post (and understanding some of the basic levers available to improving page speed) should help you better understand the potential for speed improvements on your site.

What are you optimizing for? Choosing the right page speed metrics

Like a lot of SEOs starting out, I focused my efforts on page speed, based on Google’s free PageSpeed Insights tool recommendations. It’s straight from Google’s mouth, gives very easy-to-understand metrics (a grade, just like school!) and has useful suggestions for speeding up page load times.

The tool can definitely be helpful, but as you dig into page speed, you may recognize that:

- The grading can be a bit wonky. Sometimes you will speed up how quickly your page loads, and your score will drop. Sometimes you’ll do nothing, and the score will move around some. Remember that the holy grail here is to speed up your site for your visitors, so don’t just study for the test!
- It doesn’t appear likely that Google is using this score as a ranking signal so much as load times in Google Chrome relative to other sites within the search results you’re appearing in, as Search Engine Land contributor Daniel Cristo points out.

That second point above, about your speed being factored in relative to your SERP competitors, is very important; if you have a simple B2B site, you may look at a successful e-commerce site and say, “Their site is way slower than ours and does great! We should be fine!” But the reality is, that’s not who you’re competing with. You want your site to be as fast as it can be, so you should be comparing it to the sites that are ranking in the most important search results for your site.

So, if we’re not using Google’s PageSpeed Insight tool scores as the be-all and end-all for our optimization efforts, what metrics should we be focusing on?

Search Engine Land columnist Chris Liversidge does a great job of breaking this down in further detail in his excellent post on different page speed events, but effectively my focus was on:

- Time to first byte (TTFB) — how quickly, after a request is made, your server and/or CDN (which we’ll get to in a bit) sends the first byte of data.
- Critical render path/start render — essentially, when your “above the fold” content is rendered.
- Full page render — when the entire page is loaded.

Again, we want to focus on the user experience on our site, so making sure that the content above the fold is delivered lightning-quick and that the entire page loads quickly are really the main concerns. The TTFB metric (while imperfect) can be helpful in that it lets us know if our load time issue is a result of server problems.
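If you want a rough TTFB number without a third-party tool, you can approximate it with a few lines of standard-library Python. This is a hedged sketch: it times the gap from issuing the request to reading the first response byte, which includes DNS and TCP setup, so treat it as an upper bound rather than the exact server-side figure that tools like Web Page Test report:

```python
import time
import urllib.request

def time_to_first_byte(url, timeout=10):
    """Approximate TTFB: elapsed time from issuing the request
    until the first byte of the response body is read."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # force at least one body byte to arrive
    return time.perf_counter() - start
```

Running it a few times against your home page and averaging smooths out network noise.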

So these are our metrics. How do we know if our pages are even slow, though?

Page speed measurement tools

First, we’ll need a tool to measure them. Fortunately, there are a lot of great free tools for these purposes. I used Web Page Test, which lays these out pretty simply. Here are the results for Search Engine Land, which are quite good for such a visual home page and a large and complex publishing site:

Where tools are concerned, there are a lot of options to measure speed and get suggestions, including:

- Pingdom’s free tool
- GT Metrix
- Key CDN’s free tool
- Varvy’s Tool

And others. For our purposes here, I’ll be using data from Web Page Test.

What are our goals? What’s a ‘good’ page speed?

Again, the page load times will vary significantly from niche to niche and SERP to SERP, so our initial goal should simply be “get better.” That said, let’s look at some general best-practice guidelines around target times for these events:

- TTFB — ideally under 200 ms (milliseconds), at least under 500 ms. (A Moz study from a few years ago found that many top-ranking sites had a TTFB of 350 ms, while lower-ranked sites were frequently closer to 650 ms.)
- Start render — ideally under 1 second, at least kept under 2 seconds.
- Full-page render — ideally under 3 seconds, at least kept under 5 seconds. (Google’s John Mueller recommends under 2- to 3-second load times and mentions that he uses Web Page Test as well.)
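Those rules of thumb are easy to encode if you want to sanity-check measurements in bulk. A small illustrative helper (the thresholds are the rough targets above, not anything official from Google):

```python
# (ideal, acceptable) thresholds in seconds, per the guidelines above
TARGETS = {
    "ttfb": (0.2, 0.5),
    "start_render": (1.0, 2.0),
    "full_render": (3.0, 5.0),
}

def grade(metric, seconds):
    """Return 'good', 'ok' or 'needs work' for a measured timing."""
    ideal, acceptable = TARGETS[metric]
    if seconds <= ideal:
        return "good"
    if seconds <= acceptable:
        return "ok"
    return "needs work"
```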

Again, if your full page load times are coming in at 15 seconds, and it’s 5 seconds before critical path rendering is complete, don’t just throw up your hands. Start optimizing and work to get those numbers down, even if you may not be able to get them to under a second.

Faster is better!

OK, so what can you actually do to improve page speed?

Let’s say you measure your page speed, and it’s slow; what can you do to make a difference?

The most common suggestions from Google’s PageSpeed Insights tool (and from optimization experts) include:

- Reducing server response time.
- Enabling compression.
- Leveraging browser caching.
- Eliminating render-blocking code above the fold (CSS and JS).
- Minifying code.
- Compressing and resizing images.
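To see why “enabling compression” makes the list, consider how well typical (repetitive) HTML compresses under gzip. A quick standard-library illustration:

```python
import gzip

# HTML is full of repeated tags and phrases, which deflate
# compresses extremely well.
html = b"<html><body>" + b"<p>Lorem ipsum dolor sit amet.</p>" * 200 + b"</body></html>"

compressed = gzip.compress(html)
print(len(html), "->", len(compressed), "bytes")
```

On most servers this is a one-line configuration change (e.g., mod_deflate on Apache), and the payload savings apply to every HTML, CSS and JS response.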

I’ll walk through here what optimizations I was able to implement on four different WordPress sites, and the before/after load times resulting from those optimizations. Again, these numbers won’t be true for all sites, and not every best practice will have the same impact on every site. But I think that through this process, you’ll see how some simple, quick optimizations can have a major impact on speed.

Please note that while there are some optimizations you can make with a basic understanding of HTML, there are some inflection points where it’s important to get a technical resource to jump in and figure out the best way to improve your site’s load times.

Step 1: Benchmarking our page speed metrics

First, I took a snapshot of each site’s page speed metrics on the site’s home page and a deeper article page. I did this specifically for the purposes of this post. If you’re looking to optimize your site, you’ll ideally want to look at metrics sitewide, or at least on a sampling of your highest-traffic pages and across a typical page for each template on your site.

Compared with some sites, these numbers aren’t terrible — but for simple content sites built on WordPress without a lot of bells and whistles, there’s definitely a lot of room for improvement.

What we did was implement four different commonly recommended page speed optimizations. Below, we’ll see the impact of each optimization as it was implemented, and then the cumulative impact of all of the optimizations.

So let’s dig into the optimizations.

Step 2: Code clean-up

Google’s PageSpeed Insights tool recommended we “minify” each of the sites’ CSS, JavaScript and HTML. For this, we used a free WordPress plugin called Autoptimize. It took about 20 minutes to set up across the four sites:

After optimizing HTML, JS and CSS and loading the JS and CSS inline, Google’s tool moved minification and “Eliminate render-blocking JavaScript and CSS in above-the-fold content” into the “Optimizations Found” column:

What was the impact?

As you can see, a majority of the pages saw improvement, and some saw significant 20-percent-plus upgrades. But in some cases, there was very little percentage improvement, or even worse performance. There is some variance from test to test, but what you see is that while these improvements will generally improve page speed, their level of impact varies and is dependent on the site.

Please note: This is the free version of the plugin with a “best guess” at ideal settings. Be careful in making changes to your site’s code, and as I’ll mention later in the post, this is a particular area where you may want to look to a developer for guidance.

Step 3: Browser caching

Next, we wanted to leverage browser caching. Typically, you can use WordPress plugins like WP Super Cache or WP Rocket for this purpose, but these sites were hosted on WP Engine, which has some compatibility issues with some of those plugins. So, we simply enabled the WP Engine object caching:

What was the impact?

As you can see, this had a more dramatic impact than our code cleanup efforts, and for two of the sites we saw dramatic performance improvements of 20 to 30 percent, with just two pages seeing an uptick in start render time.

Step 4: Implement a CDN

Again, WP Engine has its own CDN option, so we enabled that, which is also an extremely simple process in WP Engine:

What was the impact?

This is the first implementation where we actually saw a marked drop-off in performance. A few things to note there:

- A CDN is implemented to improve page speed for users in different parts of the country, so theoretically its (positive) impact should be greater for tests run from different areas around the world.
- Again, this was just one test against a small number of sites — other folks have tested the WP Engine CDN and seen better results, so your mileage may vary there.

The important takeaway here is really more that, once again, not every optimization will have the same impact on every site, and occasionally, some efforts will have minimal or no real impact.

Step 5: Image optimization

Finally, we worked on compressing and resizing images on each of the sites. In some cases, the images on the sites had already been compressed, and the biggest culprit was (as you’ll see) the home page for site four. I find that on sites that have been ignoring it, image optimization is frequently the quickest, easiest and highest-impact page speed win.

To do this, we used an image compression plugin called Optimus. We also compressed and resized each of the images on the pages “by hand” to make sure compression didn’t impact quality and that the files were as small as possible:

There are a number of image optimization plugins for WordPress including, but not limited to:

- Optimus
- WP Smush
- EWWW Image Compression
- Short Pixel Image Compression

Whenever you use these kinds of plugins, you do have to be cognizant of potential image quality/rendering issues somewhere on your site if you’re applying them to all of the existing images in your media library. (After some additional testing/recommendations from page speed pros — more on that below — we actually switched over to Short Pixel.)

There are also a number of tools available to compress individual images before you upload them as well, including:

- Optimizilla
- Compressor.io
- Tiny PNG

And there are many others as well.

What was the impact?

As you can see from a couple of the home pages, compressing images can lead to some of the biggest page load wins. Best of all, compressing images and replacing the uncompressed versions is a task for which you won’t generally need any kind of development help.

Again, though, the level of improvement is dependent upon the site. Sites where images have already been compressed and resized (or just happened to be smaller) will obviously see little to no gain from this particular step.

So, what was the cumulative impact of our efforts?

As you can see, the results here vary from site to site. But we’ve shaved as much as 2 seconds off of load times by following these steps, and in almost every instance, we improved the speed at which visitors are seeing our above-the-fold content.

But some load times actually got a bit worse for all of our efforts, and it seemed that for a simple site, we should be able to beat a lot of these load times. So, what else can you do?

BONUS STEP: Hire a pro!

After getting a significant yield from some of my amateur DIY efforts, I decided to go ahead and hire a developer specializing in page speed optimization. Our specialist worked on driving load times down even further. Specifically, they:

- removed or replaced plugins in my WordPress configuration that they identified as slowing the site down.
- tweaked code, server configuration and the settings of the speed optimization plugins that I had installed.

This process actually cut our (improved) load times in half. This is a great example of how a developer well-versed in page speed best practices can dramatically improve your results.

If you have a development resource internally, communicate your goals (reducing page load times and the speed that a user sees important, above-the-fold elements), and if necessary, share resources to ensure they’re aware of best practices for speeding up a site.


Don't follow the leader: Avoid these 5 common e-commerce SEO mistakes

Competitive research is an important part of any SEO program — after all, it’s a zero-sum game that we’re playing. However, there is often a tendency for companies to become fixated on what dominant competitors in the marketplace are doing. The assumption is that because they’re getting the most SEO traffic, they must be doing things right.

In many industries, it is true that the high SEO traffic sites really are doing an exceptional job. But in the world of e-commerce, this is often not the case. Many of the highest traffic e-commerce sites are doing things that are objectively bad for SEO. It turns out that a strong backlink profile and other prominent brand signals can make up for an awful lot of mistakes.

Getting things right for enterprise e-commerce SEO can be really challenging. You often have to merge very different sources of product data into a single system and make everything work. There are more pages than you could ever curate manually. And in most cases, SEO is not the largest driver of traffic and may have to take a back seat to other priorities. It’s tough.

Eventually, people are going to figure out how to address the issues that make e-commerce SEO so cumbersome and hard to scale. Sites that apply these new techniques will gain an advantage, and then everyone will race to copy them and this article will be outdated. I believe that point is still some years away.

Until then, there are opportunities to gain an SEO advantage over most of the major e-commerce players by simply avoiding their most common mistakes.

1. Faceted navigation disasters

When faceted navigation isn’t controlled, you can often end up with more category URLs, by orders of magnitude, than total products on the site. Clearly, something is wrong with that picture.

On the other end of the spectrum, you have companies that are so scared of creating too many pages that they noindex their entire faceted navigation or canonical everything to the root page. Doing this can prevent indexation of potentially valuable pages (usually ones with one or two attributes selected), and it still may not fix the crawl problems that their navigation poses.

There is a middle path, and few try to walk it. While fixing your filtered navigation is an entire topic of its own, a good starting point is to consider using dynamic AJAX sorting for thin attributes, so users can refine the product set without changing the URL.
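One way to sketch that middle path in code: keep the shallow facet combinations indexable and noindex the deep ones. The function and threshold below are hypothetical, just to make the policy concrete:

```python
def robots_meta(selected_facets, max_indexable=2):
    """Return the robots meta value for a faceted category page.

    Pages with up to `max_indexable` refinements stay indexable;
    deeper combinations are noindexed but still followed, so link
    equity keeps flowing through the navigation.
    """
    if len(selected_facets) <= max_indexable:
        return "index,follow"
    return "noindex,follow"
```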

2. Slow site speed

There is plenty of readily available data about the impact of site speed on conversion and bounce rates. A couple of seconds can make an enormous difference in user engagement. So why do retailers seem to be competing to load the most external scripts? The retail market is underinvested in speed and overinvested in lag-inducing features that often have marginal benefits and may even serve to overwhelm the user.

My experience is that the SEO benefits of page speed are not yet as substantial as the conversion optimization impact. With all the information Google is sharing about the user benefits of fast, streamlined sites, it’s only a matter of time until speed becomes a more prominent ranking factor. However, when UX impact is also taken into account, there’s no reason to wait.

3. Reliance on XML sitemaps for indexation

If there is one simple piece of SEO wisdom that every enterprise manager should remember, it’s that each page needs to have a crawl path to have a chance to rank for competitive queries. There are many unique and exciting ways (from the perspective of someone who is paid to fix websites) that sites are able to orphan a large percentage of their product or other important pages from their browsable architecture.

Possibilities include broken pagination, creating nearly infinite URL spaces, and any form of link generation logic that doesn’t systematically ensure that every product has a crawl path.

If you’re unsure about whether you have an adequate crawl path, crawl your site and see if all your important pages are showing up. If you are not able to do a complete crawl of your site, that means either that you have too many pages or you need a better crawler. If you have a very large site, you likely need help with both. And if you’re spending lots of time looking at the sitemaps dashboard in Google Search Console, wondering why your pages aren’t being indexed, it’s most likely because they don’t have a good crawl path.

4. Using tags completely wrong

Many e-commerce sites have conflicting tagging signals on their category pages and tagging structures that are suboptimal. I have seen at least two Fortune 500-owned e-commerce sites that were making all the pages on their site canonical to the home page, which is equivalent to telling Google that none of the other pages on the site have anything else to offer. I have seen more sites than I can count on one hand do their pagination tagging incorrectly, which is surprising, because it’s a plainly spelled-out specification.

I suspect that Google’s assumed omniscience sometimes hinders the careful adoption of standards. People think they can get it close enough and Google will figure it out. Sometimes they do. Sometimes they don’t. And sometimes, even if Google can figure out all your mistakes, it’s still a loss — especially if they are having to crawl extra pages to do so.

5. Ugly URLs

Here’s a thought experiment. Let’s set SEO aside for a moment and look at two different URLs that we might see in a SERP:

Site 1: www.madfancylad.com/c/armani-fedoras

Site 2: www.bromendous.com/search?product%20line=fedora&brand=Armani&REFID=23ghaWHY23093482

Which site seems more likely to make things easy for their shoppers, and which site seems more likely to make things easy for themselves? What kind of conscious and unconscious assumptions might a shopper make about each?

My experience is that short, clear and concise URLs tend to rank well and get more traffic than long, parameter-laden addresses. There are some correlational studies that support this observation. I don’t consider any of them definitive — but I know what I would choose to do for my site.
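Generating the first style of URL is not hard. A minimal sketch (the site name comes from the thought experiment above; the helper names are invented for illustration):

```python
import re

def slugify(text):
    """Lowercase and collapse anything non-alphanumeric into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def category_url(host, facets):
    """Build a short, readable category URL from selected facets."""
    return f"https://{host}/c/" + "-".join(slugify(f) for f in facets)

print(category_url("www.madfancylad.com", ["Armani", "Fedoras"]))
# https://www.madfancylad.com/c/armani-fedoras
```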


Load time, static site generators & SEO: How we lowered our own site's load time by 74%

Google has raised the bar on site load times consistently over the last decade, and the upcoming transition to mobile-first indexing, combined with its rising expectations of mobile site performance, should be a clear warning sign to site owners.

Site owners generally, however, seem not to be listening.

Unfortunately, based on our analysis of 10,000+ mobile web domains, we found that most mobile sites don’t meet this bar: The average load time for mobile sites is 19 seconds over 3G connections.

DoubleClick research published in September 2016

At our company, we have been experimenting over the last year with static site generation. Our tests on our own site are aimed at allowing us to assess the challenges facing site owners, to understand the scope of opportunity and potential for performance improvement, and also to explore the practical limitations in content management — one of the key criticisms of static site generation.

Our site, QueryClick.com, was a small, fairly well-optimized B2B site, but it averaged ~6.99 second load time in the month prior to our deployment of static site generation (July 2016), dropping to ~1.8 seconds in the month following. That represented a load time reduction of 74.29 percent, despite some server response issues experienced during the period we were actively developing the site.

One month before and after switching to a static site generation infrastructure.

We performed further server optimization improvements over the year, reaching our sub-one-second mobile device target even while testing the impact of less efficient elements driven by JavaScript.

Yes, we know! We didn’t even use sprites, gzipping, or other such techniques — which highlights the impact of a platform-first approach to solving the page speed problem.

A platform-first approach to page speed

I’ve written before about the varying levels of importance of the different aspects of page speed on SEO and about how Google’s algorithm employs data about SERP bounce-backs (when users bounce back to its SERPs after losing patience with a slow-loading site). But it’s worth making the point again as we head to a mobile-first world: server response times and the critical render path event (the point at which everything in the initial device viewport is rendered) are key to delivering high-performance SEO, especially for enterprise-level sites.

Any developer worth their salt will look at the asset load requests in QueryClick’s site displayed in the above image and shake their head at all the unoptimized elements. But that is the point. High performance was achieved despite a lack of rigorous optimization in the code and asset deployment. It was driven by the platform and the high-level architecture decisions.

Google would like us to improve further — and we will, but real performance change has already been delivered.

So, what architecture did we use? As Python and Django evangelists, we write the copy in Markdown and push it to our staging server build via GitHub, where we can check that everything looks OK. We then use Celery to schedule when the staging server copy is pushed to the live Git repository. Then, Cactus regenerates the pages and, voila, the live server is populated with the static pages.
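The core idea is simpler than the tooling makes it sound: render every page once at build time, so the live server does nothing but hand out files. A toy sketch of that idea (this is not the Cactus API; the template and names are invented for illustration):

```python
import pathlib
from string import Template

# A deliberately tiny page template.
PAGE = Template(
    "<html><head><title>$title</title></head>"
    "<body><h1>$title</h1>$body</body></html>"
)

def build(pages, out_dir):
    """Render each page to a static .html file in out_dir."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for slug, (title, body) in pages.items():
        (out / f"{slug}.html").write_text(PAGE.substitute(title=title, body=body))
```

Because the output is plain files, deployment is just a copy (or a Git push), and the “server” can be a CDN edge cache.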

Of course, for your average content producer, this infrastructure is not as simple to create or maintain as a standard CMS without some technical know-how. That is the most common criticism of static site deployments, and many enterprise clients consider it a deal-breaker when they’re looking at static site solutions.

Certainly, if you manage product inventory that changes by tens of thousands of items a day, as one of our clients does, then a robust management back end is essential.

That’s why anyone deploying a static site performance solution in the enterprise must leverage Oracle ATG or the like, which can easily generate and manipulate static web pages using its API. When you think about it, live dynamic site management requires significantly more hardware infrastructure than static.

If you need more convincing, take a look at the variety of static code bases already in flight. They use a variety of programming languages and many of them are fully capable of being fitted into an enterprise environment. When you also use a content delivery network (CDN) in production, you can offer a robust solution that delivers both blazing speed (for even poor 3G mobile connections) and total redundancy and elimination of server failure challenges.

Dynamic static asset provision and modern caching controls on static generators allow clean, live adjustment of content that is exactly comparable to dynamic site generation at a fraction of the hardware demand.

It may take years for the general web to catch up with Google’s pioneering push for modern, fast, easy-anywhere web experiences. But if you want to benefit your conversion rate and your brand experience, and enjoy a significant drive to SEO performance thanks to fast critical render performance and positive SERP bounce behavior, then you should have your development team investigate and find an architecture that works for your site today.


Breaking news: When AMP goes through the roof

Disclosure: At the time of the writing of this article, the author was the head of product for an AMP conversion platform company. That company’s technology was acquired by Google on October 9, 2017.

The audience for Accelerated Mobile Pages (AMP) has been growing steadily over the past year, from single-digit to low double-digit percentages of traffic for many websites.

Beneath that topline number is an interesting trend: For many news publishers, certain viral and breaking stories are now consumed mostly in AMP format. And on a huge news day, AMP can be a large slice of a publisher’s overall web traffic.

In those defining moments, a fast and full-featured AMP experience can help a publisher put its best foot forward.

How can AMP be that big?

First, let’s unpack a typical news publisher’s AMP traffic. Adobe reported that AMP contributed 7 percent of total web traffic to top news sites in December 2016. Since then, Google has expanded AMP exposure in search, and other platforms such as Twitter have started linking to AMP pages.

With that in mind, let’s say AMP is now around 10 percent of traffic for a typical news website, give or take. For most websites, that traffic is concentrated in mobile article views sourced from Google, Twitter and other referrers that link to AMP. (Publishers could change that by adopting canonical AMP, as some e-commerce sites have done, or by AMP-enabling more pages.) AMP might be upwards of 30–50 percent of a publisher’s mobile article views on an average day, depending on its mix of content and referral sources.
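The arithmetic behind that estimate is worth making explicit. The numbers below are illustrative assumptions for a hypothetical publisher, not measurements:

```python
# Illustrative assumptions (not measured data): AMP is ~10% of total
# site traffic, and mobile article views are ~25% of total traffic.
total_views = 100_000
amp_views = int(total_views * 0.10)             # 10,000
mobile_article_views = int(total_views * 0.25)  # 25,000

# If nearly all AMP traffic consists of mobile article views, AMP's
# share of that segment is far larger than its share of total traffic.
amp_share_of_mobile_articles = amp_views / mobile_article_views
print(f"{amp_share_of_mobile_articles:.0%}")  # 40%
```

So a modest 10 percent topline figure is entirely consistent with AMP dominating the mobile article experience.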

An analysis of dozens of local and national news websites shows that AMP can spike far above that baseline on a regular basis. For viral and breaking news stories, AMP can contribute more than 80 percent of combined desktop and mobile web views. For those stories, AMP is the primary consumer experience, and the non-AMP page is ancillary.

What causes AMP spikes?

Breaking news and viral stories over-index in mobile search referrals because search is a tool of intent, and mobile is a platform of immediacy. Mobile search traffic surges when people want to know about something happening now — from news, weather and sports events to celebrity gossip and watercooler stories.

For AMP-enabled publishers, all of that mobile Google traffic now flows to AMP pages. Since Twitter adopted AMP links in its mobile apps, Twitter traffic flows to AMP as well.

Plus, big stories invoke Google’s AMP-only Top Stories carousel, which funnels more traffic to AMP-enabled publishers.

Only AMP content is eligible for Google’s Top Stories carousel.

What does an AMP spike look like?

First, here’s the pageview trend for an ordinary story of limited, local interest (road closures). This story was consumed mostly in standard, non-AMP form. Most people probably discovered the article on the publisher’s website or social feed. The graph below shows traffic to the non-AMP and AMP versions of the article over 24 hours:

First 24 hours of pageviews to an ordinary (non-viral, non-breaking) news story. Source: Relay Media tracking pixel on standard article page and AMP page.

Now here’s a heavily searched story (celebrity death) that received 3.5 times more traffic to the AMP version than the standard version. An outsized share of AMP traffic is typical for stories like this. The graph below shows traffic to the non-AMP and AMP versions of the article over 24 hours:

First 24 hours of pageviews to a popular, heavily searched news story. Source: Relay Media tracking pixel on standard article page and AMP page.

AMP delivers speed when it matters most

Fast page load is particularly important during breaking news, and particularly noticeable in search. People seeking information and comparing stories from multiple sources will notice whether they’re getting a quick, smooth page or a slow, clunky one. Big stories are a publisher’s chance to satisfy (or frustrate) new and existing users; AMP can help ensure these defining encounters are positive.

The stakes are higher during widespread emergencies when people use their phones to find weather updates, evacuation routes, traffic updates and shelter information. Overloaded mobile networks make slow websites even slower — or totally inaccessible. At these times, a fast-loading mobile page (AMP or not) can be a public service, not just a competitive advantage.

Key questions for publishers:

Knowing that AMP skews toward viral and breaking news, here are some questions for publishers to consider:

- Are you AMP-enabling your best-performing search content, particularly breaking news, sports and weather?
- How does your AMP experience support your conversion funnel — whether your goals are subscriptions, alert opt-ins, app downloads or just another pageview or session?
- Do your AMP pages reflect your brand and give a positive first impression? Do they look like a stripped-down version of the standard page, or do they invite engagement and showcase your most sticky features?
- Do your revenue operations allow you to fully monetize AMP spikes?
- Eventually, would you consider publishing your article pages as canonical AMP so that all users get the same fast experience regardless of how they encounter your content? What are the metrics and requirements you’d need to make that decision?

By the way, Google picks up the tab

Publishers might not realize that when AMP pages are viewed in Google’s environments, they’re served from Google’s CDN — so Google pays to deliver the code and images. This is a side effect of Google’s practice of caching AMP pages in order to validate them and deliver them with reliable speed.

This does not include video, which Google does not cache and which can be the most costly element to serve. Still, the Google AMP cache could put a dent in publishers’ bandwidth bills as mobile traffic grows, especially when AMP stories spike through the roof.


Low-hanging fruit: 5 fast & simple SEO tactics to harvest tons of organic traffic

Most people new to SEO are looking for a silver bullet — that one magical tactic that solves all of their problems. The bad news is that it doesn’t exist. The good news is that there are many tactics that, while not magical, are relatively simple to execute and offer a high return on investment.

We all want to get the most out of the time, effort and money we invest into SEO, and today I’m going to help you do exactly that by identifying low-hanging fruit — the SEO tasks you’ve probably overlooked or simply haven’t updated in a long time. These are simple tasks that, once properly executed, can have a significant positive impact on your organic traffic.

1. Revise URL structure

Once upon a time, a flat URL structure was preferred from an SEO perspective, but just as most of us today don’t think the world is flat, a flat URL structure is outdated SEO thinking. A logical hierarchy helps search engines to better understand exactly what your website is about.

A simple way to understand the role of website hierarchy in SEO is to imagine the individual pages within your website as a series of nested buckets.

Your home page is about your core topic and is the largest bucket. It contains all of your second-level pages. Nested under each of those second-level pages would be any third-level pages and/or blog posts related to your second-level pages. This is a total of three levels, which should be plenty for most websites; however, four levels may be reasonable in some rare circumstances.

In the old days, this hierarchy meant creating a series of actual folders on your server, but with modern content management systems like WordPress, it’s simply a matter of using a categorical structure and configuring your permalinks properly. The URLs are then rewritten dynamically via your .htaccess file. Don’t worry if you’re not the ultra-technical type — WordPress handles all of this rewriting wizardry for you.

The first step is to change the default permalink settings. Then, if you haven’t already, publish your second-level pages, and create corresponding blog categories. The slugs for your categories must exactly match the slugs for your second-level pages. This seemingly minor detail is critical because it determines how search engines will value each page within your website relative to other pages within your website.
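For example (using WordPress’s standard permalink structure tags; the category and post slugs below are placeholders), a custom permalink structure of:

```
/%category%/%postname%/
```

would make a post with the slug choosing-a-cms in the web-design category resolve to /web-design/choosing-a-cms/, nesting it directly under your /web-design/ second-level page.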

Once properly configured, each third-level page and blog post will appear as a sub-page of the applicable second-level page, based on the blog category it is assigned to. In other words, each third-level page/post adds more authority to the page it is nested under.

2. Prune low-quality/low-traffic pages

Search engines will only crawl a finite number of the pages in your website, so to ensure that they regularly crawl your most valuable content, it’s vital to prune low-quality pages and pages with little or no traffic.

This could include:

- pages that are no longer relevant to your business.
- multiple pages that can be merged into one.
- pages that don’t add any real value (for example, groups of very similar pages, such as pages for Tampa contractors, Clearwater contractors and St. Petersburg contractors, which differ primarily by a change in city names).
- outdated blog posts.
- thin content (my rule of thumb is anything less than 500 words, but 750 or more is better).
- pages targeting very low-volume keyword phrases.
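Spotting thin content doesn’t have to be manual. A minimal sketch, assuming you can export your pages as a slug-to-text mapping (the 500-word floor is my rule of thumb from above, not a hard rule):

```python
def flag_thin_pages(pages, min_words=500):
    """Return slugs of pages whose body falls below the word-count
    threshold, making them candidates for pruning or merging."""
    return [slug for slug, text in pages.items()
            if len(text.split()) < min_words]

# Toy corpus: repeated words stand in for real page bodies.
corpus = {
    "/contractors-tampa/": "word " * 120,  # 120 words: thin
    "/ultimate-guide/": "word " * 900,     # 900 words: keep
}
print(flag_thin_pages(corpus))  # ['/contractors-tampa/']
```

Flagged pages still deserve a human look — a short page that converts well may be worth expanding rather than deleting.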

With the garbage out of the way, the overall quality of your website will improve; search engines will be more likely to crawl your higher-quality pages, which means they will find your updates sooner.

Don’t just delete these pages, though, because then you’ll lose any equity you’ve built with them. Instead, set up a 301 redirect to send both search engines and visitors who try to access them to a different but relevant page on your website. If your website runs on WordPress, there are several plugins that can help you accomplish this, but I prefer to create redirects in the .htaccess file because any plugin you add can potentially slow down your website while opening it up to vulnerabilities from hackers. This method is cleaner, more efficient and more secure.

Before you get started, it’s important to make a backup of your .htaccess file, because a single error can crash your entire website. Once you’ve done that, open the file in your FTP program and add the following line, modified to reflect the URL you’ve deleted and the URL where you want those visitors sent.

Redirect 301 /old-page-url/ https://your-domain.com/page-url-redirected-to/

Note: You might need to change the settings in your FTP program in order to see your .htaccess file because it’s treated differently than standard files.

You can generally create as many redirects as you need to, one per line, but it’s important that all redirects go directly from point A to point B. I recommend reviewing them at least once per year — more often if you have multiple people working on it — to look for any that include multiple redirects. (A to B to C should simply be A to C, for example.) Screaming Frog is a great tool to do this, and they even have a free version that will crawl up to 500 URLs, though I do recommend investing in the paid version, which removes this limitation.
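That chain review can also be automated. A minimal sketch, assuming you can export your redirects as an old-URL-to-new-URL mapping (paths below are made up):

```python
def flatten_redirects(redirects, max_hops=10):
    """Rewrite each redirect to point at its final destination, so
    A -> B -> C becomes A -> C (B -> C is already direct)."""
    flattened = {}
    for src in redirects:
        dst, hops = redirects[src], 0
        # Follow the chain; max_hops guards against redirect loops.
        while dst in redirects and hops < max_hops:
            dst, hops = redirects[dst], hops + 1
        flattened[src] = dst
    return flattened

chains = {"/old-a/": "/old-b/", "/old-b/": "/final/"}
print(flatten_redirects(chains))
# {'/old-a/': '/final/', '/old-b/': '/final/'}
```

The flattened mapping can then be written back out as one `Redirect 301` line per entry.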

3. Improve page speed

Page speed is an essential aspect of modern SEO, and it’s an area in which most websites perform poorly. Fortunately, it’s also an area that offers a tremendous return on investment because it’s relatively easy to make significant improvements with minimal effort, time or money.

The first thing you need to do is get away from the commoditized discount web hosting. Nearly every reputable web host today offers hosting packages specifically optimized for WordPress, and most of them offer SSL, caching and CDN at costs that aren’t much higher than the slower-than-mud shared hosting accounts most people use. Switching to one of these specialized web hosts will generally give you the biggest overall improvement in page speed compared to anything else you can do.

Reducing the number of installed plugins will usually give you the next biggest improvement, because each plugin requires a little bit (or in some cases, a lot) of processing power on each page load, and many of them load one or more images, scripts and CSS files. This all adds up pretty quickly. I was recently part of a project where a contractor who did not possess programming skills used 42 different plugins to accomplish what could have been done far more efficiently, simply using PHP and JavaScript. This resulted in a website where the average page was 1.55 MB, required 56 HTTP requests and took 4.8 seconds to load.

This same thinking applies to all images, JavaScript and CSS files that are a part of your website. The fewer HTTP requests, the faster your website will load.

Finally, make sure your media is properly sized and optimized. Images and video can have a significant negative impact on page speed — especially the way most people upload them. The average person doesn’t think about the dimensions or file size of an image; they only think about the fact that they want to take that gorgeous photo from their iPhone and use it on their website. They don’t realize, however, that the image is significantly larger than it should be for their website, so they simply upload it, resulting in a dramatic reduction in page speed.

This isn’t surprising considering that a photo from the typical modern smartphone can be up to 40 times larger than it needs to be for use on the web. If you include just a few of these unoptimized images on a page, you can drastically slow down your site — especially for mobile devices.

Here are some tips to optimize your media:

Images

- Choose the proper format. JPG is best for most photographic images, while GIF or PNG are usually better for images with large areas of solid color.
- Properly size images. If an image is displayed at 800px wide on your website, there is no advantage in using a 1600px-wide image.
- Compress the image file. In addition to being the top image editing program, Adobe Photoshop has amazing image compression capabilities and starts at $9.99/month. There are also free WordPress plugins like Imsanity, EWWW Image Optimizer and TinyJPG that will automatically compress the images that you upload.
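The sizing rule above is easy to apply mechanically. A small sketch (the dimensions are illustrative) that computes the pixel size an image actually needs for its display slot:

```python
def display_size(src_w, src_h, display_w):
    """Compute the pixel dimensions an image needs when shown at
    display_w wide, preserving its aspect ratio."""
    if src_w <= display_w:
        return src_w, src_h  # never upscale a smaller image
    return display_w, round(src_h * display_w / src_w)

# A 1600x1200 photo shown in an 800px-wide slot only needs 800x600,
# i.e., a quarter of the pixels:
print(display_size(1600, 1200, 800))  # (800, 600)
```

Resizing to those dimensions before compressing typically yields the biggest file-size win, since pixel count falls with the square of the scale factor.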

Video

- Choose the proper format. MP4 is best in most cases because it produces the smallest file size.
- Serve the optimal size (dimensions) based on visitors’ screen size.
- Eliminate the audio track if it’s not necessary (for example, when a video is displayed as a background element).
- Compress the video file. I use Adobe Premiere most of the time, but Camtasia is a solid choice, too. There are also free tools online, like ClipChamp’s video compressor.
- Minimize the video length.
- Upload videos to YouTube and/or Vimeo and use their iframe embedding code.

4. Build (and prune) internal links

Internal links can play a valuable role in SEO, both for purely technical reasons, and because of the positive impact they have on user experience.

From a technical perspective, internal links help search engines find more of the pages within your website and understand which pages are most important. From a user experience perspective, they help people find the content that answers their questions while keeping them on your website longer.

This is a delicate balancing act, though. It’s important to have enough internal links to make an impact, but it’s equally important to not overdo it — add too many, and you can bog down search engine crawlers and annoy human visitors.

You can do this manually (and in some cases, you may want to), but there are a few handy WordPress plugins that enable you to dynamically control internal links from a single page in your admin area (I generally use SEO Smart Links). This makes adding, editing and deleting internal links fast and easy.

Your first step is to include internal links from any relevant pages to all of the pages in your top-level navigation that cover your products and/or services. You should also include internal links to any of your pages or posts targeting high-traffic keywords and, to a lesser degree, supporting keywords. There’s not a magic formula or ratio, but if you limit it to one internal link every few paragraphs, you generally should be fine.
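One way to sanity-check that ratio is to count links as you audit a post. A rough sketch using only the standard library (the domain and HTML fragment are made up; the one-link-per-few-paragraphs threshold is the heuristic above, not a hard rule):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkDensityChecker(HTMLParser):
    """Count paragraphs and internal links (relative URLs, or links
    whose host matches site_host) in a fragment of post HTML."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.paragraphs = 0
        self.internal_links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.paragraphs += 1
        elif tag == "a":
            host = urlparse(dict(attrs).get("href", "")).netloc
            if host in ("", self.site_host):
                self.internal_links += 1

html = ('<p>Intro <a href="/services/">services</a>.</p>'
        '<p>Middle paragraph.</p>'
        '<p>See <a href="https://elsewhere.com/">this</a> too.</p>')
checker = LinkDensityChecker("your-domain.com")
checker.feed(html)
print(checker.internal_links, checker.paragraphs)  # 1 3
```

If internal links start approaching one per paragraph across a long post, that’s a signal you may be overdoing it.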

Since you’ll be pruning low-quality and/or low-traffic pages, you’ll also need to prune any old internal links to those pages. Now would also be a good time to prune any broken links to external websites. Once you’ve pruned these pages, and both internal and external links, it’s a good idea to crawl your entire website with Screaming Frog to ensure you aren’t linking to any nonexistent pages.

5. Tap existing relationships for high-quality inbound links

Anyone reading Search Engine Land knows the value of high-quality links in SEO today, and you probably also know just how difficult it is to earn those links. Quality links take tremendous time and effort to earn, which means that building quality links is not a scalable task. While that may sound like a bad thing, it’s actually an advantage, because most of your competitors will either use tactics that produce ineffective, low-quality links or even give up on link building entirely.

Unlike cold emails to random website owners begging for them to link to your content, an email to someone you already know is far more likely to be opened, to be read and to produce the desired outcome. But that doesn’t make it any less important to produce amazing content — after all, what good is it to earn organic traffic if all of your visitors click the back button shortly after arriving?

It’s also still important to seek only relevant links, because while a friend would probably be willing to throw you a link even when it’s not relevant, those types of links are unlikely to have any positive impact on your ranking — and may even harm both your website and your friend’s website if the algorithm detects a pattern of this type of activity.

If you’re just getting started and don’t have any relationships to leverage, the answer is to start building them right away. I wrote a step-by-step article on exactly how to do that, titled “The role of traditional public relations in SEO,” that should help immensely.

Anything else?

I’ve identified five simple tactics that anyone can use to improve their SEO results today, but maybe you have a few I didn’t include. If so, or if you have any feedback or questions, let me know on Twitter, and be sure to tag Search Engine Land, too.
