Research
My research focuses on the synergistic combination of hardware and AI to enable user-independent, data-efficient recognition of diverse time-series signals, including force, IMU, and EMG signals from body joints and muscles. The goal is to develop energy-efficient wearable devices that enhance human-computer interaction by pairing high-quality sensor datasets with adaptable AI models.
Task/User-Agnostic Wearable Human-Computer Interface
This research demonstrates advanced gestural interface capabilities enabled by high-quality datasets collected from newly developed wearable sensors. It features co-developed AI models that adapt to multiple users and tasks with limited training data.
The model leverages Transformer-based few-shot learning for multi-task interaction, demonstrating keyboard-less virtual typing and object/gesture recognition.
Nature Electronics, 2023
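To illustrate the few-shot idea in general terms (a generic prototypical-classifier sketch with made-up toy data, not the published Transformer model): a few labeled "support" embeddings per gesture are averaged into class prototypes, and a new query signal is assigned to the nearest prototype, so only a handful of examples per new gesture or user are needed.

```python
import numpy as np

def prototypes(support_emb, support_labels):
    """Average the few support embeddings of each class into one prototype."""
    classes = sorted(set(support_labels))
    labels = np.array(support_labels)
    return classes, np.stack([support_emb[labels == c].mean(axis=0) for c in classes])

def classify(query_emb, classes, protos):
    """Assign the query to the class of the nearest (Euclidean) prototype."""
    dists = np.linalg.norm(protos - query_emb, axis=1)
    return classes[int(np.argmin(dists))]

# Toy example: 2 gestures, 3 support embeddings each (stand-ins for learned
# encodings of wearable-sensor windows).
rng = np.random.default_rng(0)
sup = np.vstack([rng.normal(0.0, 0.1, (3, 4)),   # "pinch" cluster near 0
                 rng.normal(1.0, 0.1, (3, 4))])  # "swipe" cluster near 1
labels = ["pinch"] * 3 + ["swipe"] * 3
classes, protos = prototypes(sup, labels)

# A query drawn near the "swipe" cluster is matched to its prototype.
pred = classify(rng.normal(1.0, 0.1, 4), classes, protos)
```

The nearest-prototype rule is what makes the approach data-efficient: adding a new gesture only requires embedding a few support samples, with no retraining of the encoder.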

AI-Augmented Wristband for Gesture Recognition

This research introduces a single wrist-mounted sensor that captures subtle skin deformations caused by finger movements, inspired by the visible ligament shifts on the wrist.
The analog signal output enables low-latency processing, with an LSTM-based model predicting both finger identity and bending angle. The system was designed for adaptability across users using transfer learning and fine-tuning with minimal new data.
Nature Communications, 2021
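A minimal sketch of the user-adaptation idea (with hypothetical stand-ins, not the published LSTM): the pretrained feature extractor is frozen, and only a lightweight output head is refit on a few calibration samples from a new user to predict, e.g., a bending angle.

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen "pretrained" feature extractor: a fixed random projection stands in
# for the trained recurrent model; during fine-tuning it is not updated.
W_frozen = rng.normal(size=(8, 3))

def features(x):
    return np.tanh(x @ W_frozen)

# A handful of calibration samples from a new user: raw sensor windows (X_new)
# paired with measured bending angles (y_new, synthetic here).
X_new = rng.normal(size=(5, 8))
true_head = rng.normal(size=3)
y_new = features(X_new) @ true_head

# Fine-tune only the small head by least squares on the frozen features.
head, *_ = np.linalg.lstsq(features(X_new), y_new, rcond=None)

# Predicted bending angle for a fresh sample from the same user.
x_test = rng.normal(size=8)
pred_angle = float(features(x_test) @ head)
```

Freezing the extractor keeps the amount of new-user data small: only the head's few parameters are estimated, mirroring the transfer-learning strategy of adapting with minimal new data.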