Drafter

Skills:
Particle Photon
Physical Computing
Prototyping

Fall 2017

Designers:
Myself
Hardik Patel
Maayan Albert

Engineers:
Jacob Johnson
John Moosbrugger


Drafter:
Designing Interactions and Feedback for 3D Modeling in Augmented Reality

The Drafter project aims to develop a 3D modeling tool in Augmented Reality using natural, hand-gesture-based inputs. At its core, the project looks at how we can build intuitive design tools for Augmented Reality.

Context

I joined the Drafter team particularly interested in the ways we can ground Augmented Reality through tangible experiences. In many ways, the act of manipulating objects in AR space lacks the distinction and richness we feel when we manipulate things in reality.

By combining visual feedback with the sensation of touch, the feedback we feel in AR can become richer and more intuitive in use.

Goal

To build upon the interaction designs of Drafter and prototype a complete, working mobile model for tangible interactions in AR.

Overview

The design is focused on a table-top UI for better ergonomics and ease of use. The user could be modeling in AR for extended periods of time, which would make working increasingly tiring if their hands were constantly raised. One aspect I loved about the table-top is that the table surface itself becomes the feedback response. Similar to the HoloLens air tap gesture, the user's physical reality provides feedback that would otherwise need to be programmed.

In this design, we've taken an approach similar to the paradigm of computer-based CAD UI, which fit well with the surface experience of the table. However, I think exploring how we can engage these interactions in different modes (like grabbing, toggling, etc.) is a project worth pursuing in and of itself.

— To me, the HoloLens air tap gesture is an inspiring interaction. There's a lot of value in this simple, clever way of allowing your own body to become your feedback.

This project is broken down into two main segments:

1. Spatial UI
2. Haptic Development

While I worked on both, the majority of my time was dedicated to developing a working haptic model. 


Spatial UI

Driven by the desire to create a demo, the team first set out to define the features necessary for the tabletop demo.

For the feature designs of the tabletop UI, we wanted to start from what is commonplace, in the hope of creating an intermediate step toward designing for intuitive use of AR.


Demo Process

By looking at the needs of the demo, we were able to solidify and simplify what we needed to build. This became the foundation of our work as we proceeded through the layout, interactions, and design.


In terms of requirements, printing on 11x17 tabloid paper actually brought us close to the scale of the final prototype, as all the features needed to be within arm's reach of the user.


Haptic Feedback

Gloves were the clear choice for the form the haptic experience would take. To integrate with Leap Motion, we needed to maintain the hand's freedom of movement while bringing the experience as physically close to the person as possible.

Our team created a circuit, a framework for how the glove should feel on the hand, and then dove into the various methods to make it come to life. 
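To make the loop concrete, here is a minimal sketch of the kind of contact detection that could drive the glove, written against the Leap Motion C++ SDK's polling API. The tabletop plane height and the triggerHaptic() stub are illustrative assumptions, not our project's actual code.

```cpp
// A sketch of tabletop contact detection with the Leap Motion C++ SDK.
// kTablePlaneY and triggerHaptic() are illustrative assumptions.
#include <iostream>
#include "Leap.h"

const float kTablePlaneY = 200.0f;  // assumed interaction plane, mm above the sensor

// Placeholder: in a full setup this would signal the glove, e.g. by calling
// the cloud function the Photon firmware exposes.
void triggerHaptic(int fingerIndex) {
    std::cout << "pulse finger " << fingerIndex << std::endl;
}

int main() {
    Leap::Controller controller;
    bool wasTouching[5] = {false};

    while (true) {  // a real loop would wait on new frames instead of spinning
        Leap::Frame frame = controller.frame();
        if (frame.hands().isEmpty()) continue;

        Leap::FingerList fingers = frame.hands()[0].fingers();
        for (int i = 0; i < fingers.count() && i < 5; ++i) {
            bool touching = fingers[i].tipPosition().y < kTablePlaneY;
            if (touching && !wasTouching[i])
                triggerHaptic(i);  // pulse only on the moment of contact
            wasTouching[i] = touching;
        }
    }
}
```

Firing only on the contact edge, rather than continuously while touching, keeps the pulse feeling like a discrete tap.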


The Goal —

The goal of the haptics is to provide a richer, more grounded experience when interacting in AR. It was important to us that the haptics felt smooth and did not become a hindrance when interacting with the program: the technology itself should be invisible, and the experience highlighted.

The original prototype of the glove was a true hackfest: held together by adhesives, with the electronics out in the open. Our challenge would be to integrate everything so the experience felt seamless.

The Process —

Electronics in a wearable are hard, and as we learned throughout this process, electronics in gloves are even harder. The majority of our work scaled from getting a working circuit to integrating the experience into a well-executed glove. This meant experimenting with materials and processes.


Looking Wide: Alternative Technologies

Originally, Hardik, Maayan, and I didn't want to narrow our choices straight to haptic motors. Instead, we drew inspiration from research such as TeslaTouch and looked at the potential of other technologies, like inflatable surfaces or electrodes.

TeslaTouch was a huge inspiration for me. It bridges well into how we can manifest physical experiences to enrich digital interactions, using electrical signaling to the hand.


Prototyping the Gloves

In the end, we still landed on using haptic motors for their efficiency and reliability. While the other ideas were strong, they did not align with the project's timeline and were harder to control.

Our team decided to scour the internet for a decent glove to hack on, as sewing gloves from scratch was a time-consuming effort that did not help us advance toward prototyping the experience. We chose this one.

Components per hand include: a Particle Photon, a resistor, a diode, a capacitor, a transistor, three haptic motors, and wiring.
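As a rough sketch of the firmware side (with illustrative pin choices and names, not our exact code), the Photon can expose a cloud function that pulses one of the three motors through its transistor:

```cpp
// Hedged sketch of per-hand Photon firmware (Particle/Wiring C++).
// Pins D0-D2 and the "pulse" function name are illustrative assumptions.
int motorPins[3] = {D0, D1, D2};  // each PWM pin drives a motor via the transistor;
                                  // the diode clamps the motor's flyback voltage

// Cloud-callable: pulse motor 0, 1, or 2 for a short, tap-like buzz.
int pulse(String which) {
    int i = which.toInt();
    if (i < 0 || i > 2) return -1;
    analogWrite(motorPins[i], 200);  // ~78% duty through the transistor
    delay(40);                       // a brief pulse reads as a "tap"
    analogWrite(motorPins[i], 0);
    return i;
}

void setup() {
    for (int i = 0; i < 3; i++) pinMode(motorPins[i], OUTPUT);
    Particle.function("pulse", pulse);  // exposed over the Particle cloud
}

void loop() {}
```

A desktop program could then trigger a pulse with a POST to the Particle cloud API (/v1/devices/{device}/pulse).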

Ideally, we wanted to use conductive fabric for the main circuit in order to hide the bulky sensation of wires. We prototyped patterns for the circuit with copper tape, then translated them to fabric for sewing.


But it didn't work! Unfortunately, after prototyping and testing, we found the conductive fabric to be unreliable, and it held up poorly in use. We went back to wires as a more robust method, and worked to integrate all the wires and components into the glove so it felt fluid.


What we achieved was a glove that hid all the electronics completely under extra layers. The components were buffered by a layer of fabric, allowing the glove to feel like nothing was there, and it worked fully with the program.