Google launches new Rich Results testing tool with some rebranding

Google has announced the launch of a new version of its structured data testing tool for rich results, available at https://search.google.com/test/rich-results.

The company also said it will group rich snippets, rich cards and enriched results together under a single name, “rich results,” from now on.

Google said the new testing tool “focuses on the structured data types that are eligible to be shown as rich results.” This new version enables you to test all data sources on your pages, including the recommended JSON-LD, Microdata or RDFa. Google said this new version is a “more accurate reflection of the page’s appearance on Search and includes improved handling for Structured Data found on dynamically loaded content.”

The tool currently only supports tests for Recipes, Jobs, Movies and Courses. Google said it will be adding support for other rich results over time.
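For context, the JSON-LD format Google recommends places all of a page’s structured data in a single block rather than scattering attributes through the HTML. The sketch below builds a minimal schema.org Recipe in Python purely for illustration; the recipe name, author and values are invented, not taken from Google’s documentation.

```python
import json

# A minimal schema.org Recipe in JSON-LD -- the format Google recommends.
# Property names follow schema.org/Recipe; the values are illustrative.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Banana Bread",
    "author": {"@type": "Person", "name": "Jane Baker"},
    "recipeIngredient": ["3 ripe bananas", "2 cups flour", "1 egg"],
    "recipeInstructions": "Mash, mix and bake at 350F for one hour.",
}

# The resulting JSON is embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(recipe, indent=2))
```

The page containing this block can then be run through the testing tool to check whether the recipe is eligible to appear as a rich result.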

Here is a screen shot of the tool. Note it works on desktop or mobile:

You can check out the new rich results testing tool over here.

Google warns webmasters not to use misleading event markup

Google is warning publishers and webmasters not to use event markup in a way that is misleading to searchers, or else Google will remove the ability for your whole website to show rich snippets in its search results.

Google said it has recently updated and clarified the guidelines around the use of event markup after they received a lot of feedback around the misuse of that markup. Specifically, Google is calling out publishers in the coupons/vouchers space as marking up their offers with event markup. “Using Event markup to describe something that is not an event creates a bad user experience, by triggering a rich result for something that will happen at a particular time, despite no actual event being present,” Google wrote.
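To make the distinction concrete, here is a sketch of what legitimate Event markup looks like, built in Python for illustration; the event name, date and venue are invented. The misuse Google describes is attaching this same @type to something like a discount code, which has no start time or venue.

```python
import json

# A legitimate schema.org Event: a real occurrence with a start date and a
# physical location. Using this @type to mark up a coupon or voucher is the
# misuse described above. All values here are illustrative.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Acme Summer Concert",
    "startDate": "2017-08-12T19:00:00+00:00",
    "location": {
        "@type": "Place",
        "name": "Acme Arena",
        "address": "123 Main St, Springfield",
    },
}
print(json.dumps(event, indent=2))
```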

Here is an example of such misleading rich snippets:

If you do this, Google said it “may take manual action in such cases.” A manual action is when a human at Google marks your website as doing something against the Google guidelines. Normally, it results in ranking demotion or delisting but “it can result in structured data markup for the whole site not being used for search results,” Google wrote.

If your site gets one of these manual actions, you will find a notification in your Search Console account. From there, you can take corrective action and submit a reconsideration request.

Google has been penalizing for spammy structured markup for a few years now, but clearly, Google is going to step up action around event markup spam soon.

How machine learning levels the SERP playing field

We don’t ordinarily think of Google when we think about competition in the digital marketing world, since it seems to reliably dominate most areas in which it does business. A recent segment discussing corporate monopolies on John Oliver’s “Last Week Tonight” hilariously referenced Bing as the dominant search engine with a graphic that stated, “Bing. The best place to Google something.”

For the most part, however, the digital marketing sphere has been a fairly competitive landscape, though there were exceptions to this maxim. Established brands frequently dominated top SERP positions because of long-standing trust, fresh domains had to wait their turn in line, and black-hat SEO allowed webmasters to game the system and deliver high rankings for thin content. A decade ago, SEO agencies and webmasters could apply simple heuristics and buzzworthy keywords to rank content regardless of its utility to user intent or actual quality.

The Hummingbird update and subsequent rollout of RankBrain changed all of these notions entirely.

They should also be changing SEOs’ ideas of how to achieve success. Though many SEO experts understand the importance of RankBrain, or at least how important it will be, they still employ the conventional strategies we made a living off a decade ago.

In this column, I’ll explain why you should remodel the way you look at search engine optimization. And I’ll also offer some advice on machine learning applications and SEO strategies you can employ to compete in the cutthroat SEO landscape.

How machine learning revolutionized search

Machine learning is a subset of artificial intelligence that allows computers to learn independently of human intervention, learning iteratively by grouping items with similar properties and inferring values from those shared properties.

Google’s RankBrain, which the company says is its third most important ranking factor, is applied to determine the context of new search queries that it has not received before. RankBrain distinguishes the context of unlearned searches by pulling semantically similar keywords/phrases and comparing them with similar past searches to deliver the most relevant results.
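As a toy illustration of that matching step, the sketch below compares a never-before-seen query against past queries using simple word-overlap (Jaccard) similarity. RankBrain actually works on learned vector representations, so this is only a stand-in for the idea; the queries are invented.

```python
# Toy illustration of matching an unseen query against past queries by
# similarity. Plain word overlap (Jaccard similarity) stands in here for
# the learned vector representations a real system would use.
def jaccard(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

past_queries = [
    "cheap flights to hawaii",
    "best pizza near me",
    "how to fix a flat tire",
]

new_query = "inexpensive flights hawaii deals"

# Pick the past query most similar to the new one; its results inform
# what to show for the unfamiliar query.
best = max(past_queries, key=lambda q: jaccard(new_query, q))
print(best)  # "cheap flights to hawaii"
```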

Google employs machine learning technology to find patterns and make sense of relevant data when it analyzes user engagement with web pages in its SERP listings. With this data, Google’s algorithm evaluates user intent. From Google’s perspective, this helps filter results more effectively and rewards users with a better experience.

Currently, conventional signals are still applied to rank the best results. With each subsequent, relevant search, machine learning can analyze which web pages are receiving the best user signals and provide the best results to meet user intent. It’s important to note that machine learning isn’t instantaneous but would result in slow ranking changes based on growing data from its SERPs.

This has two broad implications for keyword research and ranking:

- Keyword rank is no longer affected by dramatic shifts.
- Google’s algorithm is more dynamic; different algorithms are employed for each unique search.

In more competitive niches, content quality and increased user engagement will slowly take precedence over conventional signals, leveling the SERP playing field. In low-volume searches, conventional ranking signals will still be applied as the de facto standard until enough data is available to determine user intent.

This has also brought semantic search to the fore for SEO experts. Semantic search allows content to rank for multiple keywords and get increased traffic by meeting the intent of various related search queries. The clearest examples of semantic search’s impact are the related searches field at the bottom of Google SERPs and the “People also ask” box below the featured snippet field.

As Google becomes capable of understanding human intent and linguistic intelligence, technical SEO and keyword usage will take a back seat to user signals. Considering different algorithms are applied to unique searches, links will be reduced in their role as the arbiters of content quality, and smaller domains will have a better fighting chance to compete against industry titans organically.

If searcher intent determines which algorithm will be pulled for SERP listings, how do we optimize and even track this? The answer involves using both conventional strategies and our own machine learning technology.

Give the people what they want

Here are a few methods SEOs should be using to keep current with the evolving environment:

1. Improve user experience

Searchmetrics’ 2016 report on ranking factors illustrated just how important user signals were to organic ranking. The company found that user signals were second only to content relevance in terms of importance.

One of the best ways that a search engine can determine user intent is by analyzing user signals, which it gathers through its Chrome browser, direct URLs, SERPs and so on. But Google’s most valued user signal remains CTR.

To ensure your web pages deliver good user signals, you must create a solid UX foundation. This means providing thematic continuity across your web pages, creating high-quality and relevant landing pages, using engaging images, offering interactive content, delivering fast page speed and developing an organized internal linking structure.

Metatags and rich snippets can also influence your click-through rate, so optimize for both. Google will obviously lower your rank if your website suffers from a low CTR in a high-ranking result.

Other considerations to keep in mind include:

- employing 301 redirects for missing pages and rel=canonical tags for duplicate content.
- optimizing structured data and alternative tags to help search engines index content.
- resolving any broken links that could affect crawl structure.

Even though Google’s AI and RankBrain are incredibly advanced, Google still needs your help to crawl web pages and index them. It doesn’t hurt that these factors also improve your website’s navigation and user experience.

2. Embrace thematic continuity

Despite all of these advancements in search, I still commonly encounter clients who operate their websites with thin content and no keyword focus. My team begins client campaigns with research on keywords, competitors and some technical aspects.

Recently, though, we began focusing on creating more seamless hierarchical structures that leverage semantically linked keywords and topic clusters to promote an awesome UX. As opposed to simply creating content with a limited keyword focus, we focused on ranking our clients’ most important pages.

HubSpot refers to this exciting new practice as “topic clusters.” Topic clusters focus on pillar pages that represent your most important topics. These will be broad, overarching pages that rank high in your information hierarchy and attempt to discuss and answer the most important questions related to your main topic.

Subtopics are then discussed in greater detail on lower-hierarchy pages that contain internal links back to the pillar page. This strategy helps communicate your most important pages through a sophisticated interlinking structure, promotes seamless navigation and helps position your pillar page to rank for multiple keyword phrases.

These evergreen pieces are also supplemented by a consistent blogging strategy that discusses trending topics related to the website’s theme. Each piece of content produced is actionable and focuses on driving conversion or desired actions.

When modeling each piece of content, it’s important to ask yourself this question: What are the problems this piece of content is seeking to address, and how will it solve them? As more questions pop up, write content addressing these issues. Now you’ve created a website that satisfies user intent from almost every possible perspective. This helps you rank for a lot of keywords.

You can also employ machine learning technology to improve the workflow of your content marketing campaign. Applications such as the Hemingway App and Grammarly are excellent tools that can suggest improvements in sentence structure, author voice and word usage.

3. Employ natural language

Perhaps the best way to optimize for an artificially intelligent search world is to optimize for voice search, as opposed to text search. This involves optimizing your website for mobile and your content to achieve featured snippets, given that answers to questions asked to a personal assistant device are pulled from the featured snippet field on a Google SERP.

In addition to following the strategies outlined thus far, this involves crafting cogent page copy that seeks to answer as many questions as possible and provide actionable solutions.

Research has also shown that people searching by voice, rather than text, are more likely to use search phrases from four to nine words in length. This means you need to optimize for long-tail keyword phrases and for page copy that is more representative of natural language. For example, a text search for flights to Hawaii may be “cheap flights Hawaii,” while a voice search may be, “What are the cheapest flights to Hawaii?”
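That length difference can be turned into a rough heuristic. The sketch below flags queries that look like natural-language voice searches; the four-word threshold and question-word list are assumptions drawn from the figures above, not anything Google has published.

```python
# Rough heuristic: voice queries tend to run longer (four to nine words)
# and to be phrased as questions. The threshold and question-word list
# are illustrative assumptions, not published rules.
QUESTION_WORDS = {"who", "what", "when", "where", "why", "how", "which"}

def looks_like_voice_query(query: str) -> bool:
    words = query.lower().split()
    return len(words) >= 4 and words[0] in QUESTION_WORDS

print(looks_like_voice_query("cheap flights hawaii"))                     # False
print(looks_like_voice_query("what are the cheapest flights to hawaii"))  # True
```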

With the rise of machine learning, optimized content that appeals to natural language could satisfy user intent for both broad match searches over text and long-tail voice searches.

Consider how chatbot assistants incorporate Natural Language Understanding (NLU) to more readily understand linguistic syntax and meanings. With advancements in NLU applications, search engines will eventually be able to entirely assess the meaning and quality of content the same way a human does.

4. Personalize the buyer’s journey

With more big data being created this year than in the past 5,000 years, businesses will need to leverage machine learning technology to interpret vast amounts of user data at an unprecedented speed.

One way this is already being executed is by mining conversational text data from chatbots. As we move from a graphical interface world into a conversational interface, chatbots are being used to map inputs and data from customer journeys to help companies improve their user experience.

This technology is still in its infancy, but we can also apply machine learning technology and data mining to personalize touch points along the buyer’s journey. Customer journey mapping can be used to build out buyer personas and personalize marketing touch points to maximize conversions and sales.

Using customer journey mapping, businesses can personalize touch points to deliver content or advertisements when intent is highest. Real-time responses can be instituted to respond to customer service calls immediately, deliver calls to action to high-scoring leads and segment advertisement campaigns based on real-time data.

Predictive analytics can also be applied to deliver predictions of estimated campaign performances based on real-time data. This will greatly save time on A/B testing and improve campaign efficiency.

Fortunately, machine learning technology can be used by anyone. Given the sheer speed and scale of machine learning applications, relying on conventional SEO strategies to rank organically may eventually put you at an incredible competitive disadvantage.

The future is already passing

Don’t worry, automation won’t totally displace humans any time soon. Machine learning technologies can help augment marketing campaigns, but the creative and execution ultimately rely on the expertise of human intelligence. But we will probably reach a point soon enough that clients will actively seek out digital marketing firms that have expertise in customer journey mapping and AI-enabled applications.

In my opinion, these technologies have the potential to greatly improve the competition for SERPs and will also allow digital marketers to deliver a stronger product.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

Google Image Search adds badges for recipes, videos, products and animated images

Google announced it will now show badges on some images within its image search interface on mobile. Adding badges to images helps searchers understand what data Google has behind the image.

Google said, “These badges will help you uncover images where next steps or more in-depth information is available — everything from bags to buy, to recipes to try.” The badges currently available include recipes, videos, products, and animated images (GIFs).

Here is an animated GIF of it in action:

These badges are powered by the schema.org markup you apply to the images on your web pages. To learn more about how to get badges to show up in Google Image search for your website, read the Google Webmaster blog. The Structured Data Testing Tool has already been updated to help you check your images’ markup.

Google Search Analytics report now breaks down job listings & details results

Google quietly announced it has added the ability to filter by job listings in the Search Analytics report within Google Search Console. If you have job listings on your website, you can now filter your Search Analytics report to see the traffic, impressions, clicks, positions and so on that those listings are bringing to your site.

Here is a screen shot Google shared of the filter, with added emojis:

This filter is available under the “search appearance” section in the Search Analytics report. You can see a way to filter by job listings or job details.

Google recently opened up job schema to all sites, so webmasters can mark up the job listings on their websites to take advantage of the new Google for Jobs opportunities.

Google publishes new FAQ on job search postings for webmasters

Mariya Moeva from Google posted a new and helpful frequently asked questions document in the webmaster help forums around the new job search functions in Google.

Google is encouraging webmasters to mark up their job listings so that Google can show them in web search for job-related queries. Yesterday, Google published this helpful FAQ around this topic.

Here is a copy of the FAQs for job search postings for webmasters:

Q: Why aren’t my jobs appearing in this feature? As with any other structured markup feature in Search, having markup doesn’t guarantee appearing in the Search results. To debug any issues that are related to the markup implementation, go through the following:

- Validate the markup in the Structured Data Testing Tool.
- Check that your sitemap has been crawled and does not contain any errors. Sitemaps need to be accurate and correct in order to be processed.
- Go through your Rich Card Report in Search Console to check if there are any potential issues with your markup.

Q: How do I check how many jobs are indexed? Use the Rich Cards Report within Search Console.

Q: Should we put the markup on the canonical or mobile page? Markup should be placed on all pages, not just the canonical link.

Q: Can we include markup on our job listing pages? No, job listing pages should not reference any job posting markup.

Q: Can we include listing pages on our sitemap? We strongly recommend that only job leaf pages are included in the sitemap. If there are job listing pages in your sitemap, please ensure that no job postings markup is included on these pages.

Q: What does the ISO 8601 format look like for the tag in the sitemap? The format for date times must follow the following convention: YYYY-MM-DDTHH:mm:ss±hh:mm. Example: 2017-06-15T16:16:16+00:00
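If you generate your sitemap programmatically, a timezone-aware datetime produces that exact shape. A quick sketch in Python:

```python
from datetime import datetime, timezone

# Format a datetime as YYYY-MM-DDTHH:mm:ss+hh:mm, the ISO 8601 shape the
# FAQ describes. Attaching a timezone makes isoformat() emit the offset.
dt = datetime(2017, 6, 15, 16, 16, 16, tzinfo=timezone.utc)
stamp = dt.isoformat()
print(stamp)  # 2017-06-15T16:16:16+00:00
```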

Q: Can we append our URLs with a tag or special attribution parameters? No. Only canonical links should be provided.

Q: Can two markup formats (e.g., JSON and Microdata) be used simultaneously on the same page? Although both formats are equally accepted, we prefer that only one format is used within each page to prevent any conflicting information between markup blocks. That being said, our team does prefer that the markup is implemented in JSON.

Q: Should we simply remove a sitemap from a sitemap index file as soon as there are no valid job posting URLs available? It is best that initially we are provided an empty sitemap file prior to removing references to the sitemap from your sitemap index file. This way, once we receive the next ping of your sitemap index we are able to properly detect markup removals for job posting pages. Once this has been done, the empty sitemap file can be removed from the index the following day.

The battleground of entities & reviews

For anyone familiar with my articles, you’ll know I like to write a lot on a couple of specific topics:

- Entities
- The future of search

Today, we’re going to look at an area where both apply: reviews.

In this article, we’re not going to dive into specific strategies for acquiring reviews, as those change over time (though I will link below to a couple of fantastic pieces that cover some current approaches well). Instead, we’re going to look at why reviews are important and how Google looks at them — and likely will be looking at them in the months and years to come. We’re going to be looking at business reviews, obviously, but we’re also going to consider reviews of specific products and similar areas.

What are ‘entities’?

Before we get to any of the above, we need to cover what an entity is to really start to wrap our heads around how they play their role. If you’ve not yet heard of entities as they relate to search algorithms, they are defined by Google as follows:

[A]n entity is a thing or concept that is singular, unique, well-defined and distinguishable. For example, an entity may be a person, place, item, idea, abstract concept, concrete element, other suitable thing, or any combination thereof.

This seems like a fairly straightforward concept, and it is. Essentially, an entity is a thing. It may be a specific person, like “Danny Sullivan,” or it may be a singular and defined idea, like “evolution.”

While simple, the impact of entities on search is massive — and it’s sadly one of the most overlooked areas of discussion in SEO. So today, we’ll take steps to remedy that in at least one area.

Let’s talk about reviews

We’re going to begin our discussion in an area we all tend to think of when we think of reviews…

Business entity reviews

From a search standpoint, it can be useful to think of your business the way the law does (if you’re incorporated, at least): it is a thing that is unique and autonomous. It may be connected with other entities, but it is not the same as them, nor does an adjustment of those connections necessarily impact the business entity itself (a business may change its CEO while changing very little, for example).

Let’s get a feel for how this all works — and since an image is worth 1,000 words, let’s look at a graphical representation of our business in Google’s eyes:

OK, perhaps this picture isn’t worth 1,000 words, but let’s assume this is your business. Now let’s add in some connections that are natural. Entities connected with your business will appear in dashed red circles, and blue arrows will establish the relationships between these entities.

Now we’re getting started in illustrating how entities work. Your business entity is connected to other entities in ways that define many of its characteristics. If you want to simplify it, you can think of them like links to and from that entity. We’ll get into that further below; for now, it’s enough to understand that a business entity is connected to other entities that define what that business is, where it’s located, who and what it’s connected to and so on.

Now, let’s add in some reviews in green dotted circles…

Now we can start to see how reviews fit into the picture. They’re not simply an unknowable ranking factor that’s “good because it’s good,” but rather a simple-to-understand addition to a business entity calculation. The more reviews you have, the more trusted the global review average will be — but further, the reviewers themselves are entities that factor in. In this area, we’re just starting to witness the first implementations of the entity status of the reviewer factoring in, but this will push forward dramatically in the coming months and years.

At this point, you may be asking what I’m referring to regarding the reviewer entity status. Great question, hypothetical you! As was reported last week, Google has changed the way it displays reviews for hotels on mobile to look like:

The key part here is the information related to the type of visitor (e.g., Families, Couples). This requires taking in entity information related to the reviewer and adjusting specific review scores based on it. So let’s look at how that fits into our graph:

This is extremely limited in its scope to include only the number of reviews someone has done and their marital status — in reality, there would be dozens or hundreds of different connections.

With just this limited example, however, we can see that if the searcher is married, they are highly likely to enjoy their experience with Acme Business Entity, whereas a single person may not like it. These are the types of expressions of entity metrics we’re seeing presently in hotel reviews, but let’s flash forward a bit.

Dave and Bill have also done a lot of reviews compared with Jane’s two, indicating they are less likely to be spammers and that they understand how the review system functions. Inevitably, other areas of their own entity metrics will factor in, such as their other reviews and ratings, age, location and so on, and many of these will invisibly influence the rating system.

The idea that the algorithm will be adjusted to weight reviews from people with similar demographic or interest-based characteristics higher is not a big reach. In the example above, does it make more sense for me as a married guy reading reviews to see the total average of 3.6/5 or the adjusted average only considering people with characteristics similar to my own, which would yield a 4.5/5?

What we’re seeing with hotels is fine, but it isn’t broad enough in scope to hit the nail on the head across all sectors. It’s a proof of concept, and it’s interesting. But there is more to me than whether I’m solo or married, traveling for business or with my family — and to believe Google will not be taking this into account is short-sighted. And here’s why that’s great…

The vast majority of businesses could not (and should not) attain a 5/5 rating from every demographic. They cater to their audience, and that’s what they should do. A hipster restaurant with craft beer would suit me well now, but back when I was a starving student… not so much. Understanding who’s writing a review and what they expect and enjoy needs to factor in strongly.

This recent step with hotels makes sense, but it cannot possibly cover all the variables that would go into a review being fully applicable to me. Rather, Google can weight all the various entity information they have and come up with what they determine to be the most applicable reviews for me.

For example, let’s take a review for a Mexican restaurant and look at just a few characteristics Google might consider if I were personally searching. Some of my core characteristics include:

- Male
- 40s
- Has favorably reviewed Mexican restaurants
- Has written and rated many locations
- Lives in Victoria, Canada
- Has reviewed and rated various restaurants with mid-to-higher price points

Armed with this data, Google is going to know that when I’m looking up a Mexican restaurant in a new city, the rating given by a middle-aged person who tends to like good food and is willing to pay for it is going to be a lot more relevant than a review from a student who tends to hit up cheaper places to save money. Both may give a five-star review to different locations, but what they recommend is not equally applicable to me — and thus, their impact on reviews and the weight they pass to an entity needs to be adjusted.

Similarly, if both reviewed the same restaurant, and if that restaurant is known to have a higher price range, the review of the one known to visit and rate pricier locations should be weighted higher than the review of someone who may have their opinion skewed by feeling the pricing is too high (or they weight it more highly because they paid more for it, not because it’s actually good).

Flash forward in review evolution a bit, and these variables would appear in an equation that would look something like:

Rating Weight Adjustment = Gender * V + Age * W + Rated Mexican * X + Number Of Reviews * Y + Location * Z

In such a scenario, each factor is given a relevancy score (how relevant is gender to the enjoyment of Mexican food?) and then adjusted by machine learning over time to account for personal considerations and the wide array of other factors that would be taken into account on top of this very short list.
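To make the hypothetical equation concrete, here is a sketch of how such a weighted adjustment could work in code. The factor names, weight values and review data are invented for illustration and are not indicative of anything actually in the algorithm.

```python
# Sketch of the hypothetical rating-weight adjustment described above.
# All weights and reviewer attributes are invented for illustration.
def rating_weight(reviewer: dict, weights: dict) -> float:
    # Sum of (factor value * factor relevancy weight), mirroring the
    # Rating Weight Adjustment equation in the text.
    return sum(reviewer[factor] * w for factor, w in weights.items())

def adjusted_average(reviews: list, weights: dict) -> float:
    # Each review's star rating counts in proportion to how similar the
    # reviewer is to the searcher (its computed weight).
    total_w = sum(rating_weight(r, weights) for r in reviews)
    return sum(r["stars"] * rating_weight(r, weights) for r in reviews) / total_w

# Illustrative relevancy weights: gender barely matters, past ratings of
# Mexican restaurants matter most.
weights = {"gender_match": 0.05, "age_match": 0.25,
           "rated_mexican": 0.45, "review_count": 0.25}

reviews = [
    {"stars": 5, "gender_match": 1, "age_match": 1, "rated_mexican": 1, "review_count": 1},
    {"stars": 2, "gender_match": 0, "age_match": 0, "rated_mexican": 0, "review_count": 0.2},
]

print(round(adjusted_average(reviews, weights), 2))  # 4.86
```

The 2-star review from a very dissimilar reviewer barely moves the average, which is exactly the personalization effect described above.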

Let’s look at the following illustration (these weight numbers are examples and not indicative of what actually is in the algorithm):

We can get a feel for how much weight each of the factors has, with gender hardly impacting them at all and past ratings of Mexican restaurants factoring in heavily. Remember, we’re looking at a person here and the value of their reviews on my results. Rightfully, whether the reviewer is male or female would have very little impact on the weight of their review; however, their writing of past reviews of other Mexican restaurants, their age being close to mine and having written a large number of reviews would cause more emphasis to be placed on their review.

If I’m right, then in the near future we’ll see the review system change to place more weight on reviews where the reviewer is similar to the searcher, and where generic influencer scores will be placed on individuals (human entities). Furthermore, I would suggest it’s highly likely that not only will review weighting be adjusted as a result of personalization, but the actual search results themselves will be more personalized than they are today.

Thinking about products

I’m about to go out on a limb to discuss an area that I feel makes sense, but for which I’m just spit-balling. We’ve been talking a lot about the impact of reviewers on review weighting and relevancy of a site to a specific demographic. But I would suggest that the products a business carries — and how those products are reviewed — may well impact an entity’s overall prominence, too.

Let’s look at a simple example based on our second entity illustration above.

What I would predict we will see in the near future is that the reviews of a specific product, or “product entity,” will impact a business entity’s status if they sell that product (even if the review is from a different site). If a company were selling only products with low reviews across different sites, I would put forth that that business entity’s overall score would be diminished (certainly for queries related to those products or that product category).

One can think of this as tell-tale breadcrumbs. All of these products are understood to be under a specific hierarchy/category, and that category is understood to contain low-quality items (though, again, this could be adjusted based on reviewer demographics). And thus, the Acme Business Entity would be reduced in the value assigned to it for that category of products.

I need to stress, again, that at this time I have not seen any evidence of this. As I said above, I’m just spit-balling here. But if one simply thinks about an environment where Google wants to provide its searchers with results that will meet their needs — and assuming they have the information to connect the reviews of one product with another on a different site — it is a logical and beneficial angle to pursue.

So, what do you do?

We’ve covered a lot here about how entities and reviews can and likely will impact rankings and how review scores will likely be augmented further in the very near future to place more weight on those reviews that more closely match the searcher’s intent and interests. So let’s review what you need to pay attention to…

- Who is reviewing you, and what their reviews are. You can’t please all of the people all of the time, but are you pleasing your target demographic? Be clear on your site or in your business who you are catering to and what they can expect.
- Which of the products and services you offer are being reviewed favorably and poorly across the web. This is simply a good business move (clearing out bad products and focusing on the good); however, if I’m right, and this will start to impact your own rankings and review scores, it will be more important than ever.
- How your site connects with other entities (e.g., authors in your blog, companies you’re affiliated with) and how they are rated. If you’re associated with poorly reviewed and rated entities, this flow of influence (or rather, lack thereof) will impact you.

In the end, the point is that we can no longer focus on simply how our business entity is reviewed but must look at how the entities it’s connected to are reviewed and who is doing that reviewing. We’re being forced into an environment where we need to look at our business as a whole, what we offer, who we partner with and who we cater to. While we need to respond to negative reviews as always, we need to be more conscious of who is doing the reviewing and whether they are part of our target demographic.

Resources

I promised above to link to some resources on how to get reviews and the risks involved, since we didn’t talk much about those specific strategies here. Here are some of my favorite pieces on the subject:

- Thomas Ballantyne Speaking On Reviews At SMX West — Search Engine Land
- Info On The Consumer Review — Search Engine Land
- How To Get Local Reviews & Rating — Yoast
- Create A Link For Customers To Write Reviews — Google

Conclusion

I hope that if nothing else, this article has given you food for thought. While a lot of this article is based on ideas not yet implemented, most are logical, and we’re starting to see some of the early signs that this is the direction things are about to take. Our job (yours and mine) is to be ready for these things when they come, and being ahead of the curve in understanding what’s happening will help us make business decisions that lead naturally to a better entity status for our companies. Fortunately, there is no downside to following the ideas listed above; they simply force us to understand the complexity (and simplicity) of the way Google approaches entities, as outlined in its many patents on the subject and in the changes we’re seeing it make every day.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.

Google announces similar items schema for image search on mobile

Google’s image search results on the mobile web and in the Android Search app are now showing “similar items” — i.e., if you’re looking at “lifestyle” images and click on one that you like, Google may show you additional product images from places where you can buy the item(s).

Julia E, product manager on Google Image Search, announced on the Google search blog that you need to add schema.org/Product markup to your pages to make your products eligible for inclusion in these image results. Specifically:

- Ensure that the product offerings on your pages have schema.org product markup, including an image reference. Products with name, image, price & currency, and availability metadata on their host page are eligible for Similar items.
- Test your pages with Google’s Structured Data Testing Tool to verify that the product markup is formatted correctly.
- See your images on image search by issuing the query “site:yourdomain.com.” For results with valid product markup, you may see product information appear once you tap on the images from your site. It can take up to a week for Googlebot to recrawl your website.
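To make the requirements above concrete, here is a minimal JSON-LD sketch of schema.org/Product markup with the name, image, price & currency, and availability fields Google lists. The product name, image URL, and price values are placeholders for illustration, not from Google’s announcement:

```json
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Example Leather Handbag",
  "image": "https://example.com/images/handbag.jpg",
  "offers": {
    "@type": "Offer",
    "price": "119.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

A block like this would go in a `<script type="application/ld+json">` tag on the product’s host page; run it through Google’s Structured Data Testing Tool to confirm it validates.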

Here is a screen shot of the results:

This schema was actually introduced last December, but Google never announced it.

Google says this similar items search feature is available now for “handbags, sunglasses, and shoes and will cover other apparel and home & garden categories in the next few months.”

Google quietly expands rich cards worldwide

Google is expanding support for rich cards globally. Yesterday, Google updated the blog post that originally announced this feature back in May 2016. In this update, Google said rich cards are now available globally.

Originally, rich cards launched in the US just for movie and recipe websites. That has expanded over the months, both in terms of the types of websites that are supported and the geographic versions of Google Search that support it.

Now, rich cards are fully supported globally. Google wrote:

In 2016, we launched rich cards in the US, creating a new way for site owners to present previews of their content on the Search results page. Starting today, sites all over the world can now build rich cards across Google Search.

By building Rich Cards, you have a new opportunity to attract more engaged users to your page. Users can swipe through recipes in the UK from sites like BBC Good Food or browse movies in Mexico from Cinepapaya or view restaurants in Germany from Prinz.de. Also, rich cards support the AMP format. So if you build AMP pages, users will be able to swipe near instantly from page to page.

Google has not written or announced this expansion on any of their channels yet. We found this thanks to a tip from @ShobhitSaxena22.

Google reviews schema guidelines to prohibit vulgar or profane language

Google has changed the guidelines for their reviews schema to prohibit the use of schema for reviews that contain profanity and vulgar language.

The new guidelines section added a line that reads, “[P]rofanity and vulgar language are prohibited. Do not include reviews that contain vulgar or profane language.”

This means that if you mark up your reviews with schema for Google and some of the reviews contain profanity and vulgar language, you need to remove those reviews from your website or remove the schema.
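For reference, the text Google is now policing typically lives in the `reviewBody` property of schema.org Review markup, so that is the field to screen when auditing user-submitted reviews. This is a minimal illustrative sketch with placeholder names and values, not markup from Google’s guidelines:

```json
{
  "@context": "https://schema.org/",
  "@type": "Review",
  "itemReviewed": {
    "@type": "Product",
    "name": "Example Widget"
  },
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "4",
    "bestRating": "5"
  },
  "reviewBody": "Solid product, and it arrived quickly."
}
```

If a review’s body text contains profanity, the new guidelines mean you should either remove that review from the page or leave it out of the marked-up data entirely.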

Since reviews are left by your users, you may need to adapt your internal quality controls to ensure that both old and new reviews do not contain such language.

If you do allow profanity and vulgar language in your reviews, Google reserves the right to remove your rich snippets from the search results.

This was first spotted by Aaron Bradley.