I build real-world robotic systems that combine perception, intelligence, and action.
- 🤖 Working on Unitree GO2, Franka Robots & Autonomous Systems
- 🧠 Exploring Vision-Language-Action (VLA) / Embodied AI
- 👁️ Strong in Computer Vision, Sensor Fusion & Real-Time Systems
- ⚙️ Love building end-to-end pipelines (Perception → Decision → Control)
- 🏭 Experience in industrial + client-facing robotics deployments (EY, Experience Centres)
- Build autonomous navigation systems using LiDAR + Depth cameras
- Develop real-time perception pipelines (YOLO, AprilTag, SLAM); AprilTag sketch after this list
- Integrate multi-sensor robotics systems (LiDAR + Thermal + RGB-D)
- Work on robot learning & VLA systems (Moondream2 + Robotics SDKs)
- Deploy production-ready robotics solutions, not just simulations
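
A minimal sketch of the AprilTag side of such a pipeline, assuming the `pupil-apriltags` and OpenCV packages, a camera on index 0, and tags from the `tag36h11` family (all illustrative choices):

```python
# AprilTag detection sketch. Camera index and tag family are assumptions.
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")  # swap for whatever family you print

cap = cv2.VideoCapture(0)  # hypothetical camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detector wants grayscale
    for tag in detector.detect(gray):
        cx, cy = map(int, tag.center)  # pixel coordinates of the tag center
        cv2.circle(frame, (cx, cy), 5, (0, 255, 0), -1)
        cv2.putText(frame, f"id={tag.tag_id}", (cx + 8, cy),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("apriltags", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```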
- Multi-sensor fusion (LiDAR + RealSense); projection sketch below
- Navigation + inspection pipeline
- Working toward full autonomy
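
A minimal sketch of the projection step behind this kind of LiDAR + RealSense fusion; the extrinsic matrix and pinhole intrinsics below are placeholders standing in for a real calibration:

```python
# Project LiDAR points into a color image via extrinsic + pinhole intrinsics.
# T_CAM_LIDAR and FX/FY/CX/CY are placeholder calibration values.
import numpy as np

T_CAM_LIDAR = np.eye(4)                       # camera <- LiDAR transform
FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0   # example intrinsics

def project_lidar_to_image(points_lidar: np.ndarray) -> np.ndarray:
    """Map (N, 3) LiDAR points to (M, 2) pixels in front of the camera."""
    homo = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    cam = (T_CAM_LIDAR @ homo.T).T[:, :3]     # into the camera frame
    cam = cam[cam[:, 2] > 0.1]                # drop points behind/at the lens
    u = FX * cam[:, 0] / cam[:, 2] + CX       # pinhole projection
    v = FY * cam[:, 1] / cam[:, 2] + CY
    return np.stack([u, v], axis=1)
```

Once points land in pixel space, they can be colored from the RGB image or cross-checked against depth for consistency.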
- Real-time pipeline: perception → reasoning → action (loop sketch below)
- Integrated with Moondream2 + Unitree SDK
- Robotic manipulation + decision-making
- Bridging perception with control
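
A minimal sketch of that perception → reasoning → action loop; `query_vlm()` and `send_velocity()` are hypothetical stand-ins for a Moondream2 query and a Unitree SDK velocity command, not real APIs from either project:

```python
# Perception -> reasoning -> action loop sketch with hypothetical stand-ins.
import time

def query_vlm(frame, prompt: str) -> str:
    """Stand-in for a Moondream2 visual question-answering call."""
    raise NotImplementedError  # wire up the VLM here

def send_velocity(vx: float, vy: float, vyaw: float) -> None:
    """Stand-in for a Unitree SDK velocity command."""
    raise NotImplementedError  # wire up the robot SDK here

def control_loop(grab_frame, hz: float = 2.0) -> None:
    period = 1.0 / hz
    while True:
        frame = grab_frame()                                   # perception
        answer = query_vlm(frame, "Is the path ahead clear?")  # reasoning
        if "yes" in answer.lower():                            # action
            send_velocity(0.3, 0.0, 0.0)   # walk forward slowly
        else:
            send_velocity(0.0, 0.0, 0.3)   # rotate and look for a path
        time.sleep(period)
```

The loop rate is bounded by VLM latency, which is why the reasoning step runs at a few Hz while low-level balance and gait stay inside the robot's own controller.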
- Distributed system (Desktop + NUC + Robot)
- ZMQ communication + real-time execution (publisher sketch below)
- YOLOv8 + RealSense + Jetson Orin Nano
- Sends alerts for unknown persons
- Real-world manufacturing solution
- Focus on deployment & reliability
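
A minimal sketch of the detector node in such a distributed setup, assuming `ultralytics`, `pyrealsense2`, and `pyzmq`; the port, topic name, and alert policy are illustrative, and a real deployment would identify *who* is unknown before alerting:

```python
# YOLOv8-on-RealSense detector node publishing alerts over ZMQ PUB/SUB.
# Port, topic, and the "any person = alert" policy are simplifications.
import json
import numpy as np
import pyrealsense2 as rs
import zmq
from ultralytics import YOLO

model = YOLO("yolov8n.pt")            # small model suits Jetson-class devices
sock = zmq.Context().socket(zmq.PUB)
sock.bind("tcp://*:5556")             # desktop/NUC subscribers connect here

pipeline = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(cfg)

try:
    while True:
        frames = pipeline.wait_for_frames()
        color = frames.get_color_frame()
        if not color:
            continue
        img = np.asanyarray(color.get_data())
        result = model(img, verbose=False)[0]
        persons = [b for b in result.boxes
                   if result.names[int(b.cls)] == "person"]
        if persons:
            sock.send_multipart([b"alert",
                                 json.dumps({"persons": len(persons)}).encode()])
finally:
    pipeline.stop()
```

Subscribers on the desktop or NUC connect to the Jetson's address and filter with `setsockopt(zmq.SUBSCRIBE, b"alert")`.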
- ROS2, Unitree SDK, RealSense, LiDAR (Hesai XT16)
- ZMQ, NATS (event-driven systems)
- OpenCV, PyTorch, YOLOv8
- AprilTag Detection
- SLAM, Sensor Fusion
- Vision-Language-Action (VLA)
- Embodied AI Systems
- Robot Learning (Exploring)
- Python, C++, Linux (Ubuntu)
- Jetson, Edge AI Systems
- 🚀 Building real-time embodied AI systems
- 🤖 Scaling robotics from demos → real deployments
- 🧠 Learning deeper control systems & robot learning
- 📷 Instagram: https://instagram.com/it5meyash
I don't just train models.
I make them move real robots in the real world.
