Snap is opening up a potential new revenue stream by allowing Snapchat users to “swipe up” for more information about the location or contents of a post, providing links to maps, ride-hailing services, local recommendations and restaurant bookings.
The new feature, which Snap is calling “context cards”, brings many of the opportunities associated with Google-style search queries to Snapchat’s photos and videos, without users having to type text on their smartphone’s miniature keyboard.
Search is one of the internet’s most lucrative business models. Context cards could help convince investors that Snap can be more than just a messaging app and become a platform for discovering new places, products and services.
“It’s a way to learn more about snaps that you’re viewing,” said Evan Spiegel, Snap’s chief executive, in an interview at its headquarters in Venice, Los Angeles. “We showed how communication can be made so much more engaging and fun if it’s visual. Now we believe that people want to explore the world and learn about things in a way that’s visual-first.”
To begin with, Snap is providing context cards about a post’s location — for instance, if a friend has tagged a restaurant or store with a filter, or through users browsing posts by strangers on its recently launched Map feature. Partners include ride-hailing services Uber and Lyft, restaurant booking apps OpenTable, Resy and Bookatable, and travel tipsters Foursquare, Michelin guides and TripAdvisor.
Commercial terms of these partnerships were not disclosed, but the feature could eventually add new kinds of transactional revenues to Snapchat’s primarily advertising-driven business model.
“Anytime you can connect people with what interests them and connect them with more information and more opportunities, there’s usually a business there,” Mr Spiegel said. “But it’s just so early, I think there’s a lot of work to do first.”
With its shares trading around 15 per cent below its initial public offering price of $17, Snap is under pressure from Wall Street to find new sources of growth. While revenues increased 153 per cent year-on-year in its most recent results for the quarter ending in June, its quarter-on-quarter increase in daily active users was 4 per cent, slower than some analysts had expected.
Silicon Valley’s past attempts at creating a camera-driven search app, such as Google Goggles, have failed to become anything more than novelties, even if they did demonstrate impressive image-recognition technology. Nonetheless, it remains an area of great interest for tech companies.
Last year, Amazon added Rekognition, an image detection and recognition service, to its AWS web services platform. In May, Google said its forthcoming Lens service would be able to recognise objects and locations, providing extra information about them.
Rather than forcing users to take a new photo of something they want to learn more about, Snapchat’s context menu will provide extra details for pictures and videos that others have already taken. Its users post more than 3bn “snaps” every day, giving Snap plenty of data on which to train computer-vision algorithms that could one day automatically provide context for all kinds of products and services that are shown in an image.
“This experience is really only possible if you’ve created an ecosystem where people feel comfortable creating a huge amount of content and where people feel comfortable expressing themselves,” Mr Spiegel said. “We are just at the beginning of powering that experience. Over time, we’ll get better and better at providing more information and more experiences based on what you see.”