Manhattan start-up Immersive Labs is introducing artificial intelligence software that analyzes viewers of digital billboards in order to serve "smarter," targeted ads. The system takes into account the age, gender and facial features of passers-by, as well as environmental conditions and online data (on a cold day, for example, imagine an ad for a hot cup of coffee at a nearby Starbucks). The company has already tested the ads in New York's Sony Style Store and has plans for a Hudson News kiosk at John F. Kennedy Airport.
Facial recognition is not new, nor is the ability to gauge the composition of a crowd, but Immersive combines those characteristics with online information (such as whether a nearby sporting event has just ended) to deliver targeted advertising. The software also factors in other local data, including weather conditions and social media updates from sites like Twitter, and it measures how long someone looks at the billboard. According to Immersive Labs, the software "learns" from the data it collects and improves over time.
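To make the idea concrete, here is a minimal, purely illustrative sketch in Python of how anonymous audience attributes and contextual signals might feed an ad-selection rule. The class names, fields and rules below are hypothetical assumptions for illustration; Immersive Labs has not published its actual system, which reportedly also adjusts itself over time.

```python
# Hypothetical sketch only: these names, fields and rules are illustrative
# assumptions, not Immersive Labs' actual (proprietary) software.
from dataclasses import dataclass

@dataclass
class Viewer:
    age_range: str        # e.g. "25-34", estimated from facial features
    gender: str           # estimated; stored only as anonymous categories
    dwell_seconds: float  # how long the person looked at the screen

@dataclass
class Context:
    temperature_f: float       # local weather feed
    nearby_event_ended: bool   # e.g. a sporting event just let out

# Fallback ads keyed by estimated age bracket (hypothetical categories).
DEFAULT_BY_AGE = {
    "18-24": "streaming_service_spot",
    "25-34": "smartphone_spot",
}

def pick_ad(viewer: Viewer, ctx: Context) -> str:
    """Simple rule-based selection; a learning system would instead tune
    these rules over time using signals such as dwell time."""
    if ctx.temperature_f <= 50:
        return "hot_coffee_ad"          # the article's cold-day coffee example
    if ctx.nearby_event_ended:
        return "sports_merchandise_ad"  # crowd leaving a nearby game
    return DEFAULT_BY_AGE.get(viewer.age_range, "general_brand_spot")

if __name__ == "__main__":
    viewer = Viewer(age_range="25-34", gender="unspecified", dwell_seconds=3.2)
    context = Context(temperature_f=41.0, nearby_event_ended=False)
    print(pick_ad(viewer, context))  # -> hot_coffee_ad
```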
For those worried about the "creepiness" factor, Immersive CEO Jason Sosa explains that the facial mapping is strictly anonymous. "We take privacy very seriously," he says. "The information we're collecting is purely numerical. It's nothing that's going to be identified to any one individual person."
Immersive Labs emerged from TechStars, a mentorship-driven seed-stage investment program.
Related CNN Money 2-minute video report: “These ads know exactly who you are” (4/13/11)
Related Huffington Post 2-minute video report: “At Immersive Labs, Ads Watch Who Looks At Them” (4/26/11)
Related Network Advertising Initiative study: “Study Finds Behaviorally-Targeted Ads More Than Twice as Valuable, Twice as Effective as Non-Targeted Online Ads” (3/24/10)