Interaction & User Experience

User Experience Prototyping, Apple Inc.

I led R&D projects within the Industrial Design group at Apple, focusing on user experience, interaction design and the integration of bleeding-edge technologies.

My team at Apple worked to develop the interface for the iPhone X and the HomePod, among many other projects. We were responsible for collaborating with the audio engineering teams and communicating the design intent of the HomePod’s physical interface, as well as its Bluetooth pairing and music playback experience. For the iPhone X, we built the first full-screen prototypes to develop the new buttonless interface.

CuddleBot

I was the lead designer, engineer and programmer for the CuddleBot project, a therapeutic robot developed at the University of British Columbia. The CuddleBot integrated a full-body fabric touch sensor capable of recognizing individual hand gestures. It responded to these touch gestures by ‘breathing’ and moving, using an artificial skeleton with motorized joints and diaphragms.
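
For illustration, here is a minimal sketch of that sense-and-respond loop, assuming a fabric pressure grid read frame-by-frame, a toy gesture classifier and a stubbed diaphragm motor; every name and threshold below is a hypothetical stand-in, not the project’s actual code.

import time

def read_pressure_grid():
    # Stub: the real sensor streamed a grid of fabric-pressure readings.
    return [[0.0] * 8 for _ in range(8)]

def classify_gesture(frame):
    # Toy classifier: map total pressure to a coarse gesture label.
    total = sum(sum(row) for row in frame)
    if total < 1.0:
        return "none"
    return "stroke" if total < 10.0 else "press"

def set_breathing_rate(rate_hz):
    # Stub for the motorized diaphragm that produced the 'breathing'.
    print(f"breathing at {rate_hz:.1f} Hz")

while True:
    gesture = classify_gesture(read_pressure_grid())
    if gesture == "stroke":
        set_breathing_rate(0.3)  # gentle touch -> slow, calming breaths
    elif gesture == "press":
        set_breathing_rate(0.2)  # firm contact -> slower still
    else:
        set_breathing_rate(0.5)  # resting rate when untouched
    time.sleep(0.1)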

This project became an open-source developmental tool in 2014, and refinements to the technology and the sensor’s detection algorithms are ongoing. It has been deployed for studies in hospitals as a repeatable, predictable means of calming children in stressful situations.

Clouds and Light

Clouds and Light was an interactive art installation that I proposed in response to a call for submissions for the new Student Union Building at the University of British Columbia. It consisted of multiple light-diffusing mesh panels that could also be illuminated by embedded LEDs. The panels could move in a synchronized, wave-like motion, driven by a series of pulleys that required active participation from passersby. The pulleys also turned a generator, which stored energy in a battery; this battery would power the LEDs at night.
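
A rough sketch of that charge-by-day, glow-by-night behaviour, assuming an ambient light sensor and a battery-voltage cutoff; the names and thresholds are illustrative assumptions, not part of the proposal.

AMBIENT_DARK_LUX = 10.0   # assumed threshold below which it counts as night
BATTERY_MIN_VOLTS = 11.5  # assumed cutoff to avoid draining the battery

def update(ambient_lux, generator_watts, battery_volts):
    # Charge whenever someone is working the pulleys; light the LEDs
    # only after dark, and only while the battery holds enough charge.
    charging = generator_watts > 0.0
    leds_on = ambient_lux < AMBIENT_DARK_LUX and battery_volts > BATTERY_MIN_VOLTS
    return charging, leds_on

# Example: dusk, pulleys idle, healthy battery -> LEDs switch on.
print(update(ambient_lux=5.0, generator_watts=0.0, battery_volts=12.4))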

The end result, as intended, would be an installation that constantly changed its appearance as the panels moved and the ambient light conditions changed.

I built a miniature, working version of the installation as a proof of concept. The project was subsequently shortlisted along with two others for approval and funding, but the proposed full-scale installation was ultimately deemed too complex to install and maintain in the space.

Wrist-worn Haptic Device

I designed, built and programmed a wrist-worn haptic device for interaction research at the CARIS robotics laboratory at the University of British Columbia. The device was intended as a simple interface and feedback loop that could accommodate many contexts of interaction; we began with the notion of a pure haptic feedback loop, wherein the tactility of one’s interaction with the device determined the tactility of the resulting feedback.
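
A minimal sketch of that pure haptic loop, assuming a pressure sensor and a vibrotactile actuator; the interfaces and the pressure-to-vibration mapping below are hypothetical, not the device’s actual firmware.

import time

def read_touch_pressure():
    # Stub for the device's touch sensor; returns normalized pressure 0.0-1.0.
    return 0.4

def set_vibration(amplitude, frequency_hz):
    # Stub for the vibrotactile actuator driver.
    print(f"vibrate: amplitude={amplitude:.2f}, frequency={frequency_hz:.0f} Hz")

while True:
    pressure = read_touch_pressure()
    # The feel of the input shapes the feel of the output: firmer touch
    # yields stronger, higher-frequency feedback.
    set_vibration(amplitude=pressure, frequency_hz=100 + 150 * pressure)
    time.sleep(0.02)  # ~50 Hz control loop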

This device was later used for prototyping attention-based interactions in which the user’s state of attention was monitored by a Galvanic Skin Response (GSR) sensor. In theory, a GSR sensor can detect moments of distraction by measuring changes in the skin’s conductivity caused by subtle shifts in moisture levels. Haptic feedback was used to alert the user when the sensor detected these moments.
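
One way such a loop might be structured, sketched under stated assumptions: skin conductance is compared against a slow running baseline, and a marked drop triggers the haptic alert. The sensor interface, smoothing factor and threshold are illustrative, not the study’s actual parameters.

import time

BASELINE_ALPHA = 0.02     # assumed smoothing factor for the running baseline
DISTRACTION_RATIO = 0.85  # assumed: alert when conductance drops below 85% of baseline

def read_gsr_microsiemens():
    # Stub for the GSR sensor; returns skin conductance in microsiemens.
    return 4.0

def pulse_haptics():
    # Stub: fire a short vibration on the wrist device.
    print("haptic alert")

baseline = read_gsr_microsiemens()
while True:
    gsr = read_gsr_microsiemens()
    baseline = (1 - BASELINE_ALPHA) * baseline + BASELINE_ALPHA * gsr
    if gsr < DISTRACTION_RATIO * baseline:
        pulse_haptics()
    time.sleep(0.5)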
