See Smarter: OTG Platforms Announces Gesture-Based Interface for Smart Glasses


Over the last couple of weeks, the conversation about wearables has shifted from our faces to our wrists. Samsung unveiled a new batch of Gear smartwatches, while Google’s Android Wear platform will be making its way to store shelves via offerings from Motorola and LG, among others. But what about smart glasses, that most visible signifier of wearable computing’s potential?

OnTheGo Platforms, or OTG, is a Portland, OR-based mobile tech startup that’s been working on an interesting new way for users to interact with their smart glasses. Instead of being tied to little buttons on the device’s side or relying on voice commands, OTG’s Augmented Reality Interface, or Ari, lets users control their smart glasses with hand gestures. The promotional video makes the process look a lot like controlling an Xbox with Kinect, though hopefully with much better results:

Today, OTG announced the availability of Ari’s beta software development kit, and the company hopes that smart glasses app developers and device makers (Google, Epson, Samsung, Recon, and Vuzix, for starters) will grab the Android-based SDK and get started on integrating it into their products. We spoke with OTG co-founder Ty Frackiewicz about Ari, the state of the wearables market, and the startup’s plans to get us all to wave our hands in the air like we just don’t care.
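To give a sense of what building against a gesture SDK like this might look like, here’s a rough sketch of an Android activity wiring up gesture callbacks for the hands-free recipe scenario Frackiewicz mentions later in the interview. To be clear, OTG hasn’t published Ari’s actual API in this announcement, so every SDK-facing name below (AriGestureManager, GestureListener, Gesture) is a hypothetical stand-in for illustration, not the real interface.

    import android.app.Activity;
    import android.os.Bundle;

    // Everything here that touches the SDK is invented for illustration;
    // OTG has not published Ari's real class or method names.
    public class RecipeActivity extends Activity {

        private AriGestureManager ari; // hypothetical entry point into the SDK

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            // Hypothetical: the manager reads frames from the glasses' single
            // (monocular) camera and calls the listener on recognized gestures.
            ari = new AriGestureManager(this);
            ari.setGestureListener(new GestureListener() {
                @Override
                public void onGesture(Gesture gesture) {
                    if (gesture == Gesture.SWIPE_RIGHT) {
                        nextRecipePage();      // hands-free page turn while cooking
                    } else if (gesture == Gesture.SWIPE_LEFT) {
                        previousRecipePage();
                    }
                }
            });
        }

        @Override
        protected void onResume() {
            super.onResume();
            ari.start(); // hypothetical: begin camera capture and recognition
        }

        @Override
        protected void onPause() {
            ari.stop();  // hypothetical: release the camera when backgrounded
            super.onPause();
        }

        private void nextRecipePage()     { /* app-specific UI update */ }
        private void previousRecipePage() { /* app-specific UI update */ }
    }

The callback shape is the interesting part: the app never handles camera frames itself, it just declares which recognized gestures it cares about, which is what would let a recipe app or an operating-room checklist stay entirely hands-free.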

BestTechie: What motivated you and OTG to focus on smart glasses? What is it about that wearables category that seemed like the right place to put your efforts?

Ty Frackiewicz: We wanted to build an application for smart glasses, but the software and hardware weren’t quite there. We want to see this space succeed, so we set out to build a platform to make it easier for app developers to create and distribute robust applications. The first piece of this problem that we decided to tackle allows for a radical redesign of how we interact with connected devices, through a more natural human-computer interface called Ari (Augmented Reality Interface).

BT: Where did the idea for Ari come from?

TF: The first software we built for smart glasses (GhostRunner) was difficult to build and taught us that for lasting market success, smart glasses require a standardized infrastructure. Therefore, our core team is formed around a central vision: a platform that makes it easier for developers to build and deploy robust applications for smart glasses. The first piece of our platform we’re releasing is Ari.

BT: What’s the creation process for Ari been like? How long did it take to go from inception to the beta that’s launching now?

TF: We have been working on Ari itself for the past year. Creating reliable, fast, mobile gesture-recognition software for a monocular camera is very difficult. Over time, we have built a large database and processes that allow us to quickly create new gestures for clients and developers.

BT: When do you foresee the beta period ending and the full launch taking place?

TF: We have no set date yet, but we are pushing for this fall to line up with the anticipated smart glasses launches.

BT: What are some of the challenges facing the wearables category in general, and smart glasses specifically? What can be done to overcome them?

TF: Right now smart glasses are very basic, with few apps and little infrastructure software, much like the original iPhone. You can see the potential, but it’s just not there yet. We are confident that, with our platform and the creativity of developers, smart glasses will get the killer applications needed for mass adoption.

BT: Can you give us some specific examples of the ways that Ari can be used in terms of practical applications with smart glasses?

TF: The first major adoption of our platform that we are seeing is from the industrial, medical, athletic, and AR [augmented reality] markets. Some examples: working on an assembly line in the industrial market, in the operating room when your hands need to stay sterile, snowboarding or skiing in the athletic market (when gloves are on), or turning the pages of a recipe while cooking. The list is practically endless, since we are creating a platform that others can build on to create custom gestures for their applications.

[OnTheGo Platforms]