Part of the “Who.Where.What?” feature we initially launched at CES 2020 was voice commands for shopping the scene. This video gives a sense of what is possible with these commands. A lot of care was taken in curating a list […]
I worked on this project with Caroline (she calls it Nudge; see more on her site). Essentially, it fetches Google Calendar data and unobtrusively, gradually lights up a notification light on your watch. Check out my code on GitHub.
My role:
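A minimal sketch of that "gradual" behavior, assuming the watch light takes a 0–255 brightness value and the ramp-up window is 15 minutes (both the function name and the window length are my illustrative choices, not the actual Nudge code):

```python
from datetime import datetime, timedelta

# Hypothetical nudge ramp: as the next calendar event approaches,
# brightness rises slowly from 0 to 255 over an assumed lead-in window.
LEAD_IN = timedelta(minutes=15)  # assumed ramp-up window

def nudge_brightness(now: datetime, event_start: datetime) -> int:
    """Return an LED brightness (0-255) that ramps up as the event nears."""
    remaining = event_start - now
    if remaining <= timedelta(0):
        return 255  # event has started: full brightness
    if remaining >= LEAD_IN:
        return 0    # too early: light stays off
    fraction = 1 - remaining / LEAD_IN
    return round(255 * fraction)
```

The point of ramping instead of blinking is exactly the "non-imposing" part: the light never demands attention, it just becomes harder to ignore.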
The Wilmington Robotic Exoskeleton (WREX) is an assistive device made of hinged metal bars and resistance bands. It enables kids with underdeveloped arms to play, feed themselves, and hug. For our biomechanics class, we attempted to replicate one from open
This prototype was meant to simulate the projected strings between tables and what happens when you “pluck” one by walking across it. The movement and sound were easier to adjust with this prototype. Made with openFrameworks.
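The core "pluck" logic can be sketched as a segment-crossing test (this is illustrative pseudologic in Python, not the actual openFrameworks code): a string is the segment between two tables, and a pluck fires when a tracked walker's position crosses that segment between two consecutive frames.

```python
# A "pluck" happens when the walker's path from the previous frame to the
# current frame intersects the string segment between two tables.

def side(a, b, p):
    """Which side of line a->b the point p lies on (sign of the cross product)."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def plucked(string_a, string_b, prev_pos, cur_pos):
    """True if the walker's step prev_pos->cur_pos crosses the string segment."""
    # Two segments cross when each segment's endpoints straddle the other's line.
    d1 = side(string_a, string_b, prev_pos)
    d2 = side(string_a, string_b, cur_pos)
    d3 = side(prev_pos, cur_pos, string_a)
    d4 = side(prev_pos, cur_pos, string_b)
    return d1 * d2 < 0 and d3 * d4 < 0
```

Once a crossing is detected, the prototype can trigger the sound and the string's visual wobble for that frame.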
We set out to redefine the way patients interact with their doctors and with their healthcare in general. We iterated extensively on novel concept designs for the patient interface. Coming from a more abstract perspective, we tried
This is a prototype for a dashboard that visually represents a person’s health with color, movement, and sound (not captured in this video). It used Spacebrew for some of the background message routing.
My role: programmer, interface designer
Worked with:
A device that relays a dog’s emotions through LEDs; here it only detects a wagging tail. The goal was to create a soft circuit that detects when a dog is wagging its tail, which then activates LEDs in a
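One way to sketch the wag detection (the actual build is a soft circuit; the sensor model and threshold here are my assumptions): a swinging tail makes the sensor reading reverse direction repeatedly, so counting direction reversals over a short window of samples separates a wag from ordinary movement.

```python
# Hypothetical wag detector: a wagging tail produces an oscillating
# sensor signal, so we count how often consecutive deltas flip sign.

WAG_REVERSALS = 4  # assumed: reversals per sample window that count as a wag

def is_wagging(samples):
    """True if the reading reverses direction often enough to look like a wag."""
    deltas = [b - a for a, b in zip(samples, samples[1:]) if b != a]
    reversals = sum(
        1 for d1, d2 in zip(deltas, deltas[1:]) if (d1 > 0) != (d2 > 0)
    )
    return reversals >= WAG_REVERSALS
```

When `is_wagging` goes true, the microcontroller can switch the LEDs on; a slower, one-directional tail movement produces few reversals and keeps them off.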