APPS

Jackrabbit Mobile showing off facial-recognition for retail during ATX Startup Crawl

Posted October 6th, 2016

Do you smile when you shop?

When you approach a cashier to pay for something or ask for help, does your face betray exasperation, anger or impatience?

An Austin app-development company, Jackrabbit Mobile, is testing a new product, Face Lab, that it hopes will help retailers and other kinds of businesses learn more about customers and clients by scanning faces, either in live video or from photos.

The app the company developed, says chief creative officer JoJo Marion, can use facial recognition to determine a person's gender, approximate age and emotional state. That information, he said, can help retailers evaluate their customer service or, when connected with sales information, improve the in-store experience.

"For ecommerce and retail, analytics are a huge part of the business. Why isn't it big for the in-store experience?" Marion asked. "Why isn't anyone focusing on it, and why aren't there more solutions?"

Contributed: Jackrabbit Mobile's "Face Lab" app uses facial cues to determine the sex, age and mood of customers, such as shoppers at retail outlets.

Jackrabbit will be showing off Face Lab as part of ATX Startup Crawl on Thursday night, Oct. 6, at its new East Austin headquarters at 1620 E. 7th St., the former home of Division 1 Bicycles.

It will also be showing off some virtual reality and augmented reality apps, including a "Cornhole" game (think bean-bag toss) in which players swing a phone as a virtual object at augmented-reality targets.

Contributed by Jackrabbit Mobile: JoJo Marion is chief creative officer of Austin's Jackrabbit Mobile.

For Face Lab, the Jackrabbit team spent a few months experimenting with image recognition in Google's then-new Cloud API, then turned to Microsoft's Cognitive Services API to add emotion analysis to the project.
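Microsoft's emotion-detection service returns a set of per-emotion confidence scores for each face it finds in an image. A minimal sketch of how a client app might pick the dominant emotion from that kind of response (the field names below are illustrative, not the exact API schema):

```python
# Sketch: pick the dominant emotion from per-emotion confidence scores,
# the kind of per-face response an emotion-analysis API returns.
# Field names are illustrative assumptions, not the exact API schema.

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    emotion = max(scores, key=scores.get)
    return emotion, scores[emotion]

# Example scores for one detected face
face_scores = {
    "anger": 0.01,
    "happiness": 0.50,
    "neutral": 0.45,
    "sadness": 0.02,
    "surprise": 0.02,
}

print(dominant_emotion(face_scores))  # ('happiness', 0.5)
```

An app like Face Lab would apply this per face, per frame, before aggregating results over time.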

Gibran Gaytan, a developer who's been working on Face Lab, said the app's ability to detect emotional cues, as well as things such as whether someone has a beard or mustache, is remarkable, given that it's not always easy to figure out what people are feeling from visual cues. "Some people don't express emotion much," Gaytan said, "whether they're happy, sad or angry."

In a quick, informal test of Face Lab, the app guessed that I was male (correct) and put me at 67 percent likely to be 36-40 and 33 percent likely to be 41-50 (I'm 41, but in fairness to Face Lab, I do try to maintain a youthful glow). It determined that my mood was 50 percent happy and 50 percent neutral, which was pretty accurate the day of my visit to Jackrabbit.

Contributed by Jackrabbit Mobile: Gibran Gaytan is a developer at Austin's Jackrabbit Mobile.

The app uses some of the same technology Microsoft promoted last year on a website, How-old.net, that could guess a person's age from a photo. The site was a tie-in to the movie "The Age of Adaline."

Marion said Face Lab might be ideal for smaller businesses that don't have access to big data about their sales and are looking for insight into how they're doing with customers.

"From security camera footage, you can start to tell, do customers walk in happy and walk out sad? Do they walk in neutral and walk out happy?" Marion said.
It could also be combined with other technologies, such as 360-degree video capture, to sense the mood and demographics of crowds at, say, a music concert.
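Once per-face mood scores exist, aggregating them into the kind of walk-in/walk-out comparison Marion describes is simple arithmetic. A hypothetical sketch comparing average "happiness" at an entrance camera versus an exit camera (the data shapes here are assumptions, not Jackrabbit's actual pipeline):

```python
# Hypothetical sketch: compare the average happiness of faces seen at an
# entrance camera vs. an exit camera. Data shapes are illustrative
# assumptions; this is not Jackrabbit's actual pipeline.

def average_happiness(faces):
    """Mean 'happiness' score across a list of per-face score dicts."""
    if not faces:
        return 0.0
    return sum(face["happiness"] for face in faces) / len(faces)

entry_faces = [{"happiness": 0.2}, {"happiness": 0.4}]
exit_faces = [{"happiness": 0.7}, {"happiness": 0.9}]

delta = average_happiness(exit_faces) - average_happiness(entry_faces)
print(f"Mood change: {delta:+.2f}")  # Mood change: +0.50
```

A positive delta would suggest customers leave happier than they arrive; the same averaging could be run over crowd footage from a 360-degree camera.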

You can check out a video about Face Lab on YouTube or see a live demo as part of Startup Crawl.
