
Re-imagining the future of productivity and spatial computing.
As a Product Designer on the Spacetop Device and OS teams, I served as the primary point of contact between design, product, and engineering for several key features. I worked directly with the hardware and software engineering teams — a collaboration that shaped the more device-level work I took on.
2022-2024
Device user experience
Human interface design
Canvas interactions
AR / Mixed Reality
Prototyping
Research
[1]
Out-of-box experience
Spacetop was a screenless laptop paired with AR glasses — a genuine new category in computing. There was no familiar UI to anchor expectations. No screen. No existing mental model to borrow.
The main questions our design guild tackled were: how does the device communicate with a user when the glasses aren't even on yet? And what's the underlying logic that governs every transition the OS makes — from boot to idle, from update to ready?
Early testers asked: is it like a VR headset? Does it track my eyes? Where's the display?
user pain points
Through early usability testing, we identified three key friction points that were breaking the onboarding experience: physical discomfort from headset fit, confusion around spatial orientation, and hesitation when interacting with the virtual workspace.
By mapping these pain points into flow diagrams, we prioritized three learning modules — glasses fitting, canvas distance, and workspace placement — to address confusion before it could escalate into frustration.
Onboarding: wizard anatomy
During the on-rails experience, the card anatomy was kept as minimal as possible so as not to overwhelm. User controls remained in place to preserve a feeling of agency: the experience could be skipped at any moment.
A carousel at the bottom of the card gave insight into the process steps. We wanted to reassure users that this would be a short experience, highlighting only the information necessary to get started with Spacetop.
[2]
Indicator state machine
Beyond the Screen — Hardware Feedback
During boot and software updates, the AR glasses weren't on. An OS update could take several minutes. How do you communicate system state to a user with no screen available?
I designed a two-channel feedback system: E-Ink Mini-Display for visual feedback and LED lighting to communicate boot states and device status. This later became the seedling for the device's Interaction State Machine.
How was it validated? Paper prototypes for the mini-display iconography; later, builds pushed from my computer's terminal to the device as APK packages. Timers simulating real boot times. Repeated cycles to make sure users felt safe leaving the device unattended. I worked directly with the embedded hardware team to ensure the LED animations were technically feasible and matched the spec.
Possible states
I started by breaking down all the different states the device could be in, and how each would look and feel to the user. Together with the engineering team, we mapped out the following possible states:
- Booting
- Restarting
- Shutting down
- Battery: low
- Battery: charging
- Battery: full
- Sleep mode
- Idle
- System-level errors or critical failures
I sketched user flows to cover each of these moments and worked through how the system would transition between them — both with and without the AR glasses on. The indicator state machine also weighed the available layers of system functionality to decide which states would take precedence over others. This let the hardware teams and the production factory build the state machine with the user's experience considered first and foremost.
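The precedence logic above can be sketched as a simple priority resolver. This is a minimal illustration, not the shipped spec: the state names mirror the list above, but the priority ordering and the function name are assumptions for the sake of the example.

```python
from enum import IntEnum

class IndicatorState(IntEnum):
    # Higher value = higher priority; a higher-priority state
    # overtakes whatever the indicator is currently showing.
    # These priority values are illustrative assumptions.
    IDLE = 0
    SLEEP = 1
    BATTERY_FULL = 2
    BATTERY_CHARGING = 3
    BATTERY_LOW = 4
    SHUTTING_DOWN = 5
    RESTARTING = 6
    BOOTING = 7
    CRITICAL_ERROR = 8

def resolve_indicator(active: set[IndicatorState]) -> IndicatorState:
    """Return the single state the LED / E-Ink indicator should show."""
    if not active:
        return IndicatorState.IDLE
    return max(active)  # IntEnum compares by value, i.e. by priority

# e.g. while booting on a low battery, the boot animation wins:
# resolve_indicator({IndicatorState.BOOTING, IndicatorState.BATTERY_LOW})
# -> IndicatorState.BOOTING
```

The key design decision this captures is that the user only ever sees one state at a time, chosen by a fixed precedence order the factory and hardware teams could implement directly.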
[3]
insights
Working at Sightful challenged me to become a medium between the product team's KPIs and the deep technological challenges of delivering new ideas and experiences with the engineering teams, both software and hardware — all while maintaining my role as a design advocate for a polished, clear, and delightful experience for our users.
The tools that made this bridging possible were Figma prototypes, creative use of AR on my everyday work laptop, After Effects animations, and running custom-made APKs from my laptop's terminal to a real Spacetop in a dev environment.
It pushed me to apply UX thinking to experiences beyond screens — and beyond typical digital design conventions.
















