I'LLBYTE

TEAM
  • Julia Daser
  • Pepi Ng
  • Beatrice Ribeiro
  • Eloise Yalovitser
TIMELINE
  • January 2023
  • 2.5 Days
ROLE
  • Art Direction
  • Physical Prototyper
  • XR Prototyper
SKILLS
  • Unity and Hardware Integration
  • Prototyping
  • Fabrication
Description
I'llByte is a prototype jaw haptic that allows users to bite into virtual objects.
Consisting of an adjustable jaw brace and a harness, along with a working demo, this hackathon submission won the Hardware Track and Technology Horizons for Human Interfaces awards at MIT Reality Hack '23.
Final Design
Takeaways
1. Having a Set Audience
When initially pitching this project to the many judges and mentors at the hackathon, we proposed a lot of use cases: entertainment, physical therapy, and sustainability.
A key piece of feedback we received after the judging was that our design solution wasn't catering to a specific audience, which made the pitch feel unfocused.
2. XR is a really amazing field!
This was my first hackathon and my first fully fleshed-out XR prototype. As a team of all-female designers, we brought out-of-the-box thinking to the event and created something truly fascinating.
The many companies, mentors, and friends I've made as a result of this hackathon made it a truly valuable experience.
Problem
The current state of VR technology focuses on creating an immersive experience through sight, hearing, and touch.
However, to date, no device stimulates the jaw, even though eating and food are such integral parts of the human experience.
With such an obvious gap in the VR industry, we decided to take on the challenge.
Sketches
Solution
We created a futuristic jaw haptic device that consists of two parts: an adjustable jaw brace and a harness. The jaw brace is made of aluminum wire and rubber bands, and can be adjusted to fit the facial profile of different users. The jaw brace is then attached to the harness via springs. The tension on the springs can be adjusted via motors.

This means that when a user “eats” a particular food item in the XR experience, the tension in the springs is adjusted automatically depending on the texture of that food item. This simulates how difficult or easy different items are to chew in an XR experience.
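To make this concrete, here is a minimal firmware sketch of how that tension control could work, written for the Arduino framework on the ESP32. The pin number, the 0-9 texture scale, and the one-byte serial protocol are illustrative assumptions for this write-up, not our exact implementation (see the source code below for that).

```cpp
// Hypothetical ESP32 sketch: maps a food-texture value sent by the XR app
// onto the PWM duty driving the tension motor, so chewier foods pull the
// springs tighter. Pin, channel, and protocol are assumptions.

const int MOTOR_PIN   = 25;    // assumed PWM-capable GPIO driving the motor
const int PWM_CHANNEL = 0;     // LEDC channel (classic ESP32 Arduino 2.x API)
const int PWM_FREQ    = 5000;  // 5 kHz PWM frequency
const int PWM_RES     = 8;     // 8-bit duty resolution (0-255)

void setup() {
  Serial.begin(115200);
  ledcSetup(PWM_CHANNEL, PWM_FREQ, PWM_RES);
  ledcAttachPin(MOTOR_PIN, PWM_CHANNEL);
}

void loop() {
  // Expect one digit per "bite": '0' = soft (slack springs), '9' = crunchy.
  if (Serial.available() > 0) {
    int c = Serial.read();
    if (c >= '0' && c <= '9') {
      int texture = c - '0';
      int duty = map(texture, 0, 9, 0, 255);  // scale texture to duty cycle
      ledcWrite(PWM_CHANNEL, duty);
    }
  }
}
```

In this scheme, the Unity side only has to send a single byte per bite, and all the spring-tension logic lives on the microcontroller.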
Challenges
Building teamwork skills
The biggest challenge, however, was the short amount of time we had. Working with a randomly formed group, choosing an idea we could all agree on, and staying calm despite setbacks were skills that were really tested during this event.

So, I'm really thankful for my team, and I'm really proud of us!
In general, haptics are really hard to develop!
One of the main requirements was to use MIT's The Singularity, an SDK that makes it easier to connect the ESP32 to Unity. Much of our time went into testing The Singularity SDK, but luckily, we were the first team in the hardware track to test it successfully.
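For a sense of what that testing involved, below is a generic loopback sketch of the kind one might use to sanity-check the ESP32-to-Unity link before layering haptics on top. It deliberately does not use The Singularity's actual API (which we won't reproduce here); it is just a raw serial echo.

```cpp
// Generic connectivity test (not The Singularity API): Unity sends a byte,
// the ESP32 echoes it back, so either side can confirm the round trip works.

void setup() {
  Serial.begin(115200);
}

void loop() {
  if (Serial.available() > 0) {
    int b = Serial.read();
    Serial.write(b);  // echo back for round-trip verification
  }
}
```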
Source Code
Though we're not developing this project further, feel free to iterate on the source code and ask us any questions!
