Google Lens: an impressive start for 'visual search'

Google Lens has gone live, or is about to, on Pixel phones in the US, the UK, Australia, Canada, India and Singapore (in English). Over the past couple of weeks, I’ve been using it extensively and have had mostly positive results — though not always.

Currently, Lens can read text (e.g., business cards), identify buildings and landmarks (sometimes), provide information on artwork, books and movies (from a poster) and scan barcodes. It can also identify products (much of the time) and capture and keep (in Google Keep) handwritten notes, though it doesn’t turn them into text.

To use Lens, you tap the icon in the lower right of the screen when Google Assistant is invoked. Then you tap the image or object or part of an object you want to scan.

As a barcode scanner, it works nearly every time. In that regard, it’s a worthy and more versatile substitute for Amazon’s app, and just as fast or faster in many cases. If there’s no available barcode, it can often correctly identify products from their packaging or labels. It also does very well identifying famous works of art and books.

Google Lens struggled most with buildings and with products that didn’t have any labeling on them. For example (below), it was rather embarrassingly unable to identify an Apple laptop as a computer, and it misidentified Google Home as “aluminum foil.”

When Lens gets it wrong, it asks you to let it know. And when it’s uncertain but you affirm its guess, you can get good information.

I tried Lens on numerous well-known buildings in New York, and it was rarely able to identify them. For example, the three buildings below (left to right) are New York City Hall, the World Trade Center and the Oculus transportation hub. (In the first case, if you’re thinking, he tapped the tree and not the building, I took multiple pictures from different angles, and it didn’t get one right.)

I also took lots of pictures of random objects (articles of clothing, shoes, money), and those searches were a bit hit-and-miss — though when it missed, it was often a near miss.

As these results indicate, Google Lens is far from perfect. But it’s much, much better than Google Goggles ever was, and it will improve over time. Google will also add capabilities that expand its use cases.

It’s best right now for very specific uses, which Google tries to point out in its blog post. One of the absolute best uses is capturing business cards and turning them into contacts on your phone.

Assuming that Google is committed to Lens and continues investing in it, over time it could become a widely adopted alternative to traditional mobile and voice search. It might eventually also drive considerable mobile commerce.

[Podcast] The Google I/O 2017 recap: Lens, Assistant & more

The Google news was fast and furious last week, with numerous announcements coming from its annual developers’ conference, Google I/O 2017. The latest episode of our Marketing Land Live podcast offers a look back at some of the bigger announcements that will impact online marketers.

Much of the focus was on Google Assistant, the company’s smart/virtual assistant that originally powered Google Home devices and has since expanded to Android phones and — as of last week — the iPhone, too. Google made a number of updates to Assistant’s capabilities, including an interesting tie-in with yet another new Google product called Google Lens. That’s an AI-powered visual search tool that turns your smartphone camera into a pretty powerful search box.

We have audio explaining these new developments directly from last Wednesday’s Google I/O keynote, featuring Google CEO Sundar Pichai, along with Scott Huffman and Rishi Chandra.

This episode runs a little more than 15 minutes. You can listen here or use the link below to subscribe via your favorite podcast service.

We invite you to subscribe via iTunes or Google Play Podcasts.

Show Notes

Google announces Google Lens, an AI-powered visual search tool

Google confirms “Google for Jobs” job search rolling out in coming weeks

Google Assistant comes to iPhone, adds alerts, hands-free calling & more

Google announces Assistant app-discovery channels, broad ranking factors

Google Actions are now available to smartphone app developers

Google I/O Roundup: Search-by-picture Google Lens & other Google Assistant features announced

Google adds more ads to Play Store, and other app ads news from I/O

Live Blog: Google talks AR & VR at Google I/O 2017

Web browsing coming to VR headsets, AR coming to the web via Chrome

Google pushes AMP Ads adoption, announces Celtra & MOAT integrations

Thanks for listening! We’ll be back soon with another episode of Marketing Land Live.

[This article originally appeared on Marketing Land.]

Google announces Google Lens, an AI-powered visual search tool

In today’s Google I/O 2017 keynote, the company is touting its advancements in machine learning and artificial intelligence — one of which is a new visual search tool called Google Lens.

Google CEO Sundar Pichai described Lens as “a set of vision-based computing capabilities that can understand what you’re looking at and help you take action.” The tool, he said, will initially be available in Google Assistant and Google Photos (both of which received several other updates, too).

In one example, Pichai showed Lens being able to identify a flower from a smartphone’s camera and offer additional information on the flower like you’d find in a Knowledge Panel. In another, he took a photo of a restaurant and Lens was able to pull up business details like you’d find via a Google Maps search — phone number, star ratings and more. Later in the keynote, Google’s Scott Huffman showed Lens working in tandem with Google Assistant. After taking a photo of a theater/club marquee showing an upcoming performance, Lens and Assistant were able to identify the band listed on the marquee and offer an option to buy tickets to the show on Ticketmaster.

At first glance, Lens is reminiscent of a 2009 technology called Google Goggles that offered the ability to do searches based on a smartphone photo. But Goggles was mostly just for identifying something; Lens has the ability to not only identify what’s in the photo, but also to give added context — for example, a restaurant’s phone number, its rating score and so forth (as shown in the GIF below that Google tweeted during the keynote).

With Google Lens, your smartphone camera won’t just see what you see, but will also understand what you see to help you take action. #io17

— Google (@Google) May 17, 2017

We’ll have much more coverage coming soon from Google I/O. You can also catch up via our Google I/O 2017 keynote live blog on Marketing Land.