Wearable Concept 2

Wearable Concept 2 (WC2) is the latest prototype from PLT Labs. It is built on the same platform as Wearable Concept 1, enhanced with new capabilities for developers to explore. Sign up to stay informed.

Concept 1

Our Concept Series prototypes are packed with sensors. Soon we’ll be exposing the APIs that surface these sensors, so you can create new applications and user experiences. Please watch the video below for an overview of what is included in this exciting wearable device.
Sign up to stay informed.


Application 1: Head Tracking

Our first Concept Series prototype has the ability to track the wearer’s head movements.

Watch the Video Demo

How Head Tracking Works

Using a low-latency, nine-axis sensor, the prototype tracks head orientation in three dimensions: heading, pitch, and roll (rotation about the vertical, side-to-side, and front-to-back axes, respectively).
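As a sketch of what an orientation stream might look like on the host side, the snippet below converts a unit quaternion (the form a nine-axis sensor-fusion algorithm typically reports) into heading, pitch, and roll in degrees. The quaternion input and axis conventions are assumptions for illustration, not the actual Concept 1 API.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion into (heading, pitch, roll) in degrees.
    Axis conventions are illustrative, not Concept 1's actual frame."""
    # Roll: rotation about the front-to-back axis
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Pitch: rotation about the side-to-side axis (clamp for float safety)
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    # Heading (yaw): rotation about the vertical axis
    heading = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (heading, pitch, roll))
```

An application would feed this with each sensor report and react to the resulting angles, e.g. re-rendering a view whenever heading changes by more than a degree.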

Application 2: Controlling Multitrack Audio

Building on the head-tracking application, this application uses head position to control which speaker plays which music track.
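One way an app might map head position onto track levels, assuming the headset reports heading in degrees: give each track a virtual speaker bearing and scale its gain by how directly the wearer faces it. The cosine falloff is an illustrative choice, not the demo's actual mixing rule.

```python
import math

def track_gains(heading_deg, track_bearings_deg):
    """Return one gain (0..1) per track, favoring the track whose virtual
    speaker the wearer is facing. Cosine falloff is illustrative."""
    gains = []
    for bearing in track_bearings_deg:
        # Smallest signed angle between the heading and the speaker bearing
        diff = (bearing - heading_deg + 180) % 360 - 180
        # Full volume when facing the speaker, silent at 90 degrees or more
        gains.append(max(0.0, math.cos(math.radians(diff))))
    return gains
```

With four tracks placed at 0, 90, 180, and 270 degrees, looking straight ahead plays the first track at full volume and mutes the one behind the wearer.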

Controlling Multitrack Audio

Application 3: Automating Multicamera Environments

This application demonstrates how the headset can detect which camera the user is looking at. Using a Mac, the system then automatically switches between cameras based on this information.
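A host-side sketch of the camera-selection step, assuming each camera's bearing relative to the wearer is known to the installation: pick the camera closest to the current heading.

```python
def select_camera(heading_deg, camera_bearings_deg):
    """Return the index of the camera whose bearing is angularly closest
    to the wearer's heading. Bearings are a hypothetical setup input."""
    def angular_distance(a, b):
        # Wrap-around-safe distance between two compass bearings
        return abs((a - b + 180) % 360 - 180)
    return min(range(len(camera_bearings_deg)),
               key=lambda i: angular_distance(heading_deg, camera_bearings_deg[i]))
```

The switcher would call this on every orientation update and cut to the returned camera when the selection changes.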

Automating Multicamera Environments

Potential Applications

In the spirit of inspiring creativity, here are some potential applications for wearable technology from PLT Labs:


Augmented Experiences
(Using Google Street View)

Here’s something to get you thinking about applications. We’ve combined the head-tracking, gyro, and compass capabilities and mashed them up with Google Street View. Based on the GPS coordinates from a smartphone, the system can show where you’re standing and what you’re looking at, rendering this locally on your device in real time as location or head orientation changes. So tourists anywhere in the world could know where they are in relation to unfamiliar surroundings, and they could show friends and family back home just how far they’ve come in the world.
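A rough sketch of the Street View half of that mashup: combine the phone's GPS fix with the headset's heading and pitch to request a matching Street View frame. The parameter names follow Google's public Street View Static API; the glue code itself is hypothetical.

```python
from urllib.parse import urlencode

def street_view_url(lat, lng, heading_deg, pitch_deg, api_key):
    """Build a Street View Static API request mirroring what the wearer
    sees: GPS fixes the location, head orientation sets the view."""
    params = {
        "size": "640x400",          # image dimensions in pixels
        "location": f"{lat},{lng}",  # from the smartphone's GPS
        "heading": round(heading_deg, 1),  # from the headset's compass
        "pitch": round(pitch_deg, 1),      # from the headset's sensor
        "key": api_key,
    }
    return "https://maps.googleapis.com/maps/api/streetview?" + urlencode(params)
```

An app would re-request (or pre-fetch neighboring headings) as the orientation stream updates, giving the real-time effect described above.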

Potential Fitness Applications

Potential fitness applications could take advantage of Concept 1’s onboard pedometer functionality, free-fall detection, and music (A2DP) streaming capabilities.
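Free-fall detection is conceptually simple: while falling, the accelerometer reads near 0 g because the sensor falls along with everything it measures against. A minimal sketch, with an illustrative 0.3 g threshold rather than Concept 1's actual setting:

```python
def is_free_fall(ax_g, ay_g, az_g, threshold_g=0.3):
    """Flag free fall from one accelerometer sample (in g). At rest the
    magnitude is ~1 g; in a drop it collapses toward 0 g. The threshold
    is an illustrative value, not the device's actual setting."""
    magnitude = (ax_g * ax_g + ay_g * ay_g + az_g * az_g) ** 0.5
    return magnitude < threshold_g
```

A fitness app would debounce this over several consecutive samples before, say, pausing a workout or raising an alert.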

Enhanced Gaming

Heading, pitch, and roll data could register a head nod or shake. Tracked movements could potentially be used in conjunction with mobile controllers to navigate game environments more intuitively. Head tracking is just one more input that game designers could deploy to help create more nuanced user experiences.
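A nod detector could be sketched from the pitch stream alone: a nod shows up as pitch swings that reverse direction at least twice within a short window. The thresholds and windowing here are illustrative assumptions, not the device's recognizer.

```python
def detect_nod(pitch_samples, min_swing_deg=10.0, min_reversals=2):
    """Detect a head nod in a short window of pitch samples (degrees):
    look for swings of at least min_swing_deg that reverse direction
    at least min_reversals times. Thresholds are illustrative."""
    reversals = 0
    direction = 0          # +1 rising, -1 falling, 0 unknown
    last_extreme = pitch_samples[0]
    for p in pitch_samples[1:]:
        delta = p - last_extreme
        if abs(delta) >= min_swing_deg:
            new_direction = 1 if delta > 0 else -1
            if direction != 0 and new_direction != direction:
                reversals += 1
            direction = new_direction
            last_extreme = p
    return reversals >= min_reversals
```

A shake detector would be the same logic applied to heading instead of pitch, which is how a game could distinguish "yes" from "no" inputs.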

Real Estate
(Using Google Street View)

Agents could potentially show clients properties via mobile phones or tablets. Or conversely, clients could show agents properties or neighborhoods that interest them.

First Response
(Using Google Street View)

Base personnel, for example, could direct first responders at night or in smoky environments via head tracking, gyro, and compass capabilities. In turn, first responders could relay location and status of emergency events back to the base.

Custom Gestures

Applications could be developed that would enable a person to answer or decline calls with custom gesture combinations. For example, customer service reps could initiate automatic responses with simple head gestures.
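Such a mapping might look like the dispatch table below; the gesture names and actions are hypothetical, not a PLT Labs API.

```python
# Hypothetical mapping from recognized gestures to call-control actions
GESTURE_ACTIONS = {
    "nod": "answer_call",
    "shake": "decline_call",
    "double_tap": "mute",
}

def handle_gesture(gesture):
    """Dispatch a recognized gesture to an action name; unknown
    gestures are ignored. All names here are illustrative."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```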

What’s Inside the Concept 1 Wearable

  • Real-time device orientation data stream
  • Free-fall detection
  • Pedometer
  • Tap detection
  • Yes/No gesture recognition
  • Support for simultaneous voice and data
  • Wear state (on/off)
  • Proximity (near/far) to host devices
  • Bluetooth (HFP and A2DP)
  • Onboard MFi (Made For iOS) chip for data transmission over Bluetooth with iOS devices

Getting Access to What’s Inside

  • Developer APIs for major mobile and desktop operating systems
  • Documented code samples and example applications

This is just the beginning. Have some ideas?

Let’s get the conversation started on Twitter @PLTlabs #PLTlabs