A GNN-based dynamics model combined with MCTS lets robots efficiently scoop granular objects from containers, bridging the sim-to-real gap by adapting simulation-trained models with minimal real-world data.
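A minimal sketch of such a planning loop, assuming a learned dynamics model `dynamics_gnn(state, action)` and a scooping reward `reward(state)`; these names, the discrete hashable action set, and all hyperparameters are illustrative placeholders rather than the project's actual implementation:

```python
import math
import random

class Node:
    """One node of the search tree over scooping actions."""
    def __init__(self, state, parent=None, action=None):
        self.state, self.parent, self.action = state, parent, action
        self.children, self.visits, self.value = [], 0, 0.0

def ucb(node, c=1.4):
    # Upper confidence bound used to pick among fully expanded children.
    if node.visits == 0:
        return float("inf")
    return node.value / node.visits + c * math.sqrt(
        math.log(node.parent.visits) / node.visits)

def mcts_plan(root_state, actions, dynamics_gnn, reward, iters=200, depth=5):
    root = Node(root_state)
    for _ in range(iters):
        # 1. Selection: descend by UCB while nodes are fully expanded.
        node = root
        while node.children and len(node.children) == len(actions):
            node = max(node.children, key=ucb)
        # 2. Expansion: try one untried action via the learned GNN dynamics.
        tried = {c.action for c in node.children}
        untried = [a for a in actions if a not in tried]
        if untried:
            a = random.choice(untried)
            node = Node(dynamics_gnn(node.state, a), parent=node, action=a)
            node.parent.children.append(node)
        # 3. Rollout: simulate a short random action sequence with the model.
        state, ret = node.state, 0.0
        for _ in range(depth):
            state = dynamics_gnn(state, random.choice(actions))
            ret += reward(state)
        # 4. Backpropagation: push the rollout return up the tree.
        while node:
            node.visits += 1
            node.value += ret
            node = node.parent
    # Return the most-visited first action from the root.
    return max(root.children, key=lambda c: c.visits).action
```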
A dialogue-based robot integrates CLIP-powered visual-language grounding with autonomous navigation to assist visually impaired users, guiding them to desired landmarks and providing real-time environmental descriptions through natural, unconstrained speech interactions.
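A minimal sketch of the visual-language grounding step using the Hugging Face CLIP interface; the checkpoint, landmark image files, and user utterance below are placeholders, not the system's actual pipeline:

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Candidate landmark views captured by the robot (placeholder file names).
landmark_images = [Image.open(p) for p in ["door.jpg", "elevator.jpg", "desk.jpg"]]
query = "take me to the elevator"  # placeholder user request

inputs = processor(text=[query], images=landmark_images,
                   return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_text: similarity of the single query against each landmark image.
scores = outputs.logits_per_text.softmax(dim=-1)
best = scores.argmax().item()
print(f"Navigate to landmark {best} (confidence {scores[0, best].item():.2f})")
```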
Projects & Experiences
Autonomous "Sentry" Robot for RoboMaster Competition
In this project, I developed the complete software stack for a custom omnidirectional robot using ROS and FreeRTOS, enabling smooth navigation across varied environments with FAST-LIO SLAM and the TEB local planner on a MID360 3D LiDAR.
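For illustration, a hedged sketch of sending a navigation goal through a standard ROS 1 move_base action interface; the node name, frame, and goal pose are placeholders, and the actual sentry stack may wire FAST-LIO and the TEB planner differently:

```python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("sentry_goal_sender")          # placeholder node name
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

# Placeholder goal: 2 m ahead in the map frame, facing forward.
goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0
goal.target_pose.pose.orientation.w = 1.0

client.send_goal(goal)                         # local planning handled by TEB
client.wait_for_result()
```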
Wheeled-Legged Balancing Robot
I led the development of a wheeled-legged balancing robot, designing a custom STM32F103 microcontroller board and its embedded software, with LQR- and VMC-based control for stable locomotion. Our project received the "Grainger Best Overall Project" award, ranking first among 40 teams.
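As a rough illustration of the LQR portion, a sketch of computing a gain matrix offline with SciPy; the linearized model and cost weights below are made-up placeholders, not the robot's identified dynamics, and constants like these would then be baked into the STM32 firmware:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized balancing model, state x = [theta, theta_dot, x, x_dot].
# These matrices are illustrative placeholders only.
A = np.array([[0.0, 1.0, 0.0, 0.0],
              [20.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [-2.0, 0.0, 0.0, 0.0]])
B = np.array([[0.0], [-1.5], [0.0], [0.8]])
Q = np.diag([100.0, 1.0, 10.0, 1.0])   # penalize body tilt most heavily
R = np.array([[0.5]])                  # penalize wheel torque

# Solve the continuous-time algebraic Riccati equation and form K = R^-1 B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print("LQR gain K:", K)
```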
Illini RoboMaster (Embedded Systems Team Lead)
As an undergraduate, I led the embedded systems team for Illini RoboMaster, competing in six competitions across the US and China. Our team placed 2nd (top 10%) in both the 2022 and 2023 RoboMaster University League North America rounds, and I developed embedded software and perception modules for 5+ distinct robots.