I’m interested in building embodied intelligent systems that can perceive, learn, and act collaboratively with humans.
My research integrates:
- 🧠 Reinforcement Learning & Imitation Learning for adaptive decision-making and skill transfer
- 🤖 Robotics & Embodied AI to teach robots how to learn from demonstrations and interact with the physical world
- 👁️ Computer Vision & NLP for multimodal perception and natural communication between humans and machines
- 🎓 Research Intern at LIRIS Lab – École Centrale de Lyon
  - Working on behavior cloning for robotic manipulation using LeRobot
  - Exploring ACT, π₀, and diffusion-based policies for generalization
  - Studying multi-task vs. single-task policy learning and transferability
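At its core, behavior cloning is supervised learning on demonstration data: a policy is fit to map observations to the actions a demonstrator took. A minimal sketch of the idea (a toy linear policy recovered by least squares; the dimensions, data, and `W_expert` controller are illustrative assumptions, not LeRobot's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy demonstrations: observations (e.g. joint angles) paired with expert
# actions. In practice these come from teleoperated episodes; here the
# "expert" is a known linear controller we try to recover (hypothetical).
W_expert = np.array([[1.0, -0.5], [0.3, 2.0]])
obs = rng.normal(size=(500, 2))      # 500 demo observations
actions = obs @ W_expert.T           # corresponding expert actions

# Behavior cloning step: fit the policy by minimizing mean squared error
# between predicted and demonstrated actions (closed-form least squares
# here; a neural policy would use gradient descent on the same loss).
W_policy, *_ = np.linalg.lstsq(obs, actions, rcond=None)
W_policy = W_policy.T

# The cloned policy now imitates the demonstrator on unseen observations.
new_obs = np.array([0.5, -1.0])
predicted_action = W_policy @ new_obs
expert_action = W_expert @ new_obs
print(np.allclose(predicted_action, expert_action, atol=1e-6))  # True
```

The same recipe scales up by swapping the linear map for a sequence model (ACT) or a diffusion head, which is exactly the design axis the policies above explore.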
My broader research interests:
- Embodied & Interactive AI
- Reinforcement and Imitation Learning
- Robotic Perception and Control
- Vision-Language Models (VLMs)
- Human-Centric and Responsible AI
Selected projects:
- 🦾 Robotic Manipulation via Behavior Cloning — imitation-based learning using LeRobot
- 🦷 3D Smart Factory — 3D teeth segmentation using PointNet
- 🌾 Flahti — RAG-based agricultural assistant powered by Mistral-7B
- ⚖️ Legal Assistant — RAG chatbot providing advice on Moroccan law
- 📝 Arabic OCR — CRNN model for handwritten text recognition
- 🌿 PlantTech — deep learning app for plant disease detection
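The two RAG projects above share one skeleton: embed the documents, retrieve those most similar to the query, and prepend them as context for the generator. A minimal retrieval sketch (bag-of-words "embeddings" and cosine similarity as illustrative stand-ins; the corpus and prompt template are made up, and a real system would use a dense embedder plus an LLM such as Mistral-7B for the generation step):

```python
from collections import Counter
import math

# Toy corpus standing in for a real document store (illustrative only).
docs = [
    "Wheat should be irrigated early in the morning to reduce evaporation.",
    "Moroccan labor law caps the standard work week at 44 hours.",
    "Tomato blight is treated by removing infected leaves and rotating crops.",
]

def embed(text: str) -> Counter:
    """Bag-of-words 'embedding' — a stand-in for a dense sentence encoder."""
    return Counter(w.strip(".,?!") for w in text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str) -> str:
    # Retrieved context is prepended so the generator answers from the
    # documents rather than from parametric memory alone.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt("How many hours is the work week under Moroccan law?")
```

Production systems add document chunking, a vector index, and top-k > 1 retrieval, but the retrieve-then-generate flow stays the same.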
I'm always open to discussions about:
- Reinforcement Learning
- Generative AI / Large Language Models
- Robotics & Computer Vision
- NLP and Arabic AI
- Applied Machine Learning & Deployment
📫 Reach me at: al.mh.mohamed@gmail.com
⭐️ Inspired by real-world AI impact — let's innovate together!