Demonstrations
The NAIL Lab focuses on developing resilient and sustainable intelligent networked autonomous systems. Our demonstrations showcase real-world applications of our research in multi-agent systems, learning-based control, human-autonomy teaming, and safety-critical systems.
📚 Research Projects & Publications
Video demonstrations accompanying our published research papers and ongoing research projects.
Perception-Aware Leader-Follower Control with CBFs | ICRA 2025 (Experiment Video)
Two ROSbot Pro 2 robots demonstrate perception-aware leader-follower control using Control Barrier Functions (CBFs) to maintain visual contact. Our approach reliably keeps the leader within the follower's field of view, while conventional formation control without CBFs often loses visual contact, leading to unsafe behavior.
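The general idea behind a CBF safety filter can be sketched in a few lines. The snippet below is an illustrative toy, not the controllers used in the paper: it assumes simplified single-integrator bearing dynamics, models the field-of-view constraint as h(θ) = cos(θ) − cos(θ_max), and solves the single-constraint CBF quadratic program in closed form with NumPy.

```python
import numpy as np

def cbf_qp_filter(u_nom, grad_h, h, alpha=1.0):
    """Minimally modify u_nom so that grad_h . u >= -alpha * h
    (the CBF condition for single-integrator dynamics x_dot = u).
    With one linear constraint, the QP has a closed-form solution."""
    u_nom = np.asarray(u_nom, dtype=float)
    a = np.asarray(grad_h, dtype=float)
    b = -alpha * h
    if a @ u_nom >= b:            # nominal input already satisfies the CBF condition
        return u_nom
    # otherwise, project u_nom onto the constraint boundary a . u = b
    return u_nom + (b - a @ u_nom) / (a @ a) * a

# Toy FOV constraint: keep the leader's bearing theta within +/- theta_max.
theta, theta_max = 0.6, 0.7                    # radians (illustrative values)
h = np.cos(theta) - np.cos(theta_max)          # h > 0 while leader is in view
grad_h = np.array([-np.sin(theta)])            # dh/dtheta, with theta_dot = u
u_nom = np.array([1.5])                        # turn rate that would lose the leader
u_safe = cbf_qp_filter(u_nom, grad_h, h, alpha=2.0)
```

Here the filter reduces the aggressive nominal turn rate just enough to satisfy the barrier condition, which is the mechanism that keeps the leader inside the camera's field of view in the demonstration.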
Distributed Leader-Follower Formation Control with Vision Constraints | ICRA 2025 (Supplementary Video)
A perception-aware distributed formation control scheme for agents using body-fixed cameras with limited field of view (FOV). Our CBF-based approach ensures leader visibility by incorporating FOV constraints, while neural network and double bounding box estimators provide robust state estimation from real-time image data across various environments.
FACETS: Efficient Once-for-all Object Detection via Constrained Iterative Search | ICRA 2025 (Late Breaking Session)
Our FACETS-derived model running on the MAX78000 microcontroller demonstrates efficient object detection within tight hardware constraints (432 KB memory, 32-layer limit). Despite these limitations, it achieves 45.4% less energy consumption, 29.3% lower latency, and 4.5% higher mAP compared to Analog Devices' TinierSSD baseline.
Toward Embedded LLM-Guided Navigation and Object Detection for Aerial Robots | ICRA 2025 (Late Breaking Session)
A hierarchical framework integrating natural language commands with autonomous quadrotor navigation on the ModalAI Seeker drone. Our fine-tuned LLaMA model (via LoRA) translates high-level instructions into task goals, which the drone executes through onboard VIO-based control, path planning, and real-time object detection, validated with hardware-in-the-loop testing.
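One way to picture the interface between the language layer and the navigation layer is a small parser that validates the LLM's structured reply before it reaches the planner. This is a hypothetical sketch, not the lab's implementation: the JSON schema (`goals`, `action`, `position`, `target`) is assumed for illustration.

```python
import json

def parse_task_goals(llm_reply: str):
    """Validate an LLM's JSON reply and extract executable task goals.
    Assumed schema: {"goals": [{"action": "goto", "position": [x, y, z]},
                               {"action": "detect", "target": "..."}]}"""
    data = json.loads(llm_reply)
    goals = []
    for g in data.get("goals", []):
        if g.get("action") == "goto" and "position" in g:
            x, y, z = g["position"]                  # metres, world frame
            goals.append(("goto", (float(x), float(y), float(z))))
        elif g.get("action") == "detect":
            goals.append(("detect", g.get("target", "unknown")))
        # unrecognized actions are dropped rather than forwarded to the planner
    return goals

reply = ('{"goals": [{"action": "goto", "position": [1.0, 2.0, 1.5]}, '
         '{"action": "detect", "target": "person"}]}')
task_goals = parse_task_goals(reply)
```

Validating the reply at this boundary keeps malformed or hallucinated LLM output from ever reaching the flight controller, which matters for hardware-in-the-loop testing.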
🎓 Capstone Research Projects
Undergraduate and graduate student project demonstrations showcasing innovative applications and learning outcomes.
UH Students Build Autonomous Drone with Real-Time Object Detection | NAIL Lab Capstone Project
Computer Engineering Technology undergraduates designed, assembled, and programmed this custom drone from scratch as their senior capstone project. Working under NAIL Lab guidance, the drone features autonomous navigation using OptiTrack motion capture and real-time onboard object detection—demonstrating hands-on robotics and AI development by UH students.
🔬 Lab Equipment Demonstrations
Demonstrations showcasing our AI-driven heterogeneous robotic platforms and their autonomous capabilities in real-world scenarios.
Autonomous UAV Landing on Moving UGV | Multi-Robot Collaboration
Demonstration of advanced multi-robot coordination where an autonomous UAV successfully lands on a moving ground vehicle, showcasing precise real-time control and inter-robot communication.
ModalAI Seeker SLAM Drone Demo: Mapping, Motion Planning & Object Detection
Live demonstration of our ModalAI Seeker drone performing simultaneous localization and mapping (SLAM), autonomous motion planning, and real-time object detection in complex environments.
Fully Autonomous Jackal Robot Demo | SLAM, Obstacle Avoidance & Object Detection
Complete autonomous navigation demonstration featuring our Jackal robot performing SLAM, dynamic obstacle avoidance, and object detection in both indoor and outdoor environments.
For more information about our research capabilities or potential collaborations, please contact us.