A structured roadmap from absolute beginner fundamentals in Python and mathematics, through machine learning and deep learning, to building and deploying advanced agentic AI systems.
Before we dive in, I'm Romani Nasrat, AI Engineer @ Penny Software, freelancing since 2023. See my full bio below.
💡 So this isn’t just a normal roadmap — I’m sharing my actual personal learning journey, from college all the way to professional AI work. Every resource here is personally tested and verified based on what I studied and used to reach where I am today. You’ll find both theoretical and practical topics in mathematics, computer science, and AI, along with carefully chosen learning resources. Everything is organized step-by-step, so just keep learning and don’t worry. This roadmap represents a 4-year journey, including 2 years of professional experience freelancing and full-time work, so don’t rush. Take your time, okay? You won’t become great in two months — and that’s totally fine.
Let me say it in Egyptian Arabic 😊
💡 الرود ماب دي مش مجرد رود ماب عادية — أنا بشارك رحلتي الشخصية الحقيقية في التعلم، من أيام الكلية لحد الشغل في مجال الـ AI بشكل احترافي. كل مصدر هنا أنا جربته وذاكرته بنفسي، بناءً على اللي اتعلمته واستخدمته عشان أوصل للمستوى اللي أنا فيه دلوقتي. هتلاقي فيها توبيكس نظرية وعملية في الرياضيات وعلوم الحاسب والـ AI، مع مصادر تعليمية مختارة بعناية. كل حاجة منظمة خطوة بخطوة، الرود ماب دي بتمثل رحلة 4 سنين، منهم سنتين خبرة عملية فريلانسنج وشغل فل تايم، فما تستعجلش. خد وقتك، تمام؟ مش هتبقى محترف في شهرين ودي حاجة عادية تماماً.
🎯 Who is this for?
- Beginners starting from scratch with Python and math
- Students transitioning into AI/ML careers
- Developers who want to build AI models, deploy systems, or create intelligent AI agents
- Anyone looking for a clear, structured path from fundamentals all the way to production-ready
📚 Table of Contents - Click to expand/collapse
- 🚀 The Complete AI Engineer Roadmap 2026
- Agentic & Generative AI: From Zero to Deployment
- 📋 About This Roadmap
- Phase 0 – Foundations
- Phase 1 – Introduction to AI
- Phase 2 – AI & Machine Learning & Deep Learning & Data Science fundamentals
- Phase 3 – Agentic AI Systems
- 📂 Additional Resources
- 🤝 Contributing
- ⭐ Show Your Support
- 👨💻 About Me
- 📬 Contact & Connect
Note: This is just a starting point. For me, it came after two years of college study and before I began learning AI, so I already had a background in computer science and programming. However, if you’re a complete beginner, you can start here as well by learning Python from scratch.
To build robust AI agents, you need more than just "scripts". You need an engineering-first approach to Python. In this phase you will learn the fundamentals of Python programming, including data structures, OOP, and modern Python features that are essential for developing AI systems. This will give you the tools to write clean, efficient, and maintainable code for your AI agents.
🔧 1. Foundations & Core Mastery
- Absolute Basics: Variables, Data Types (`int`, `float`, `str`, `bool`), and Basic Operators.
- Functions: Defining functions, `args`/`kwargs`, parameters, return values, and scope.
- Syntax & Logic: Loops, list comprehensions, and Python 3.10+ `match` statements.
- Data Structures: Deep dive into `list`, `dict`, `set`, `tuple`, plus `collections.deque` and `dataclasses`.
- Functional Tools: `lambda`, `map`, `filter`, and the `itertools` module.
- Exception Handling: Try/Except/Finally, custom exceptions, and context-aware error handling.
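To tie these basics together, here is a minimal, self-contained sketch (the task names and data are invented for illustration) combining a `dataclass`, a `deque`, a list comprehension, and a custom exception:

```python
from collections import deque
from dataclasses import dataclass


class EmptyQueueError(Exception):
    """Custom exception raised when we try to read from an empty task queue."""


@dataclass
class Task:
    name: str
    priority: int = 0


def next_task(queue: deque) -> Task:
    """Pop the oldest task, raising a descriptive error if the queue is empty."""
    try:
        return queue.popleft()
    except IndexError:
        raise EmptyQueueError("no tasks left to process") from None


# Build a queue of tasks, then filter it with a list comprehension.
tasks = deque(Task(name=f"task-{i}", priority=i % 3) for i in range(5))
high_priority = [t.name for t in tasks if t.priority == 2]
print(high_priority)
print(next_task(tasks))
```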
🏗️ 2. Advanced OOP & Design Patterns
- OOP 4 pillars: Inheritance, Abstraction, Encapsulation, Polymorphism.
- OOP Fundamentals: Mixins (critical for framework customization), and Composition.
- Magic Methods: Understanding `__init__`, `__call__`, `__repr__`, and `__getattr__`.
- Decorators: Building custom decorator logic (logging, timing, agent retry logic, etc.).
- Context Managers: Resource management using `with` statements and `contextlib`.
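As a concrete illustration of the last two items, here is a small sketch of a hand-rolled retry decorator and a timing context manager built with `contextlib`; the retry settings and `flaky_call` are placeholders, not a real API client:

```python
import time
from contextlib import contextmanager
from functools import wraps


def retry(times: int = 3, delay: float = 0.5):
    """Decorator that retries a flaky function, e.g. an LLM or API call."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_error = None
            for _ in range(times):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:  # in real code, catch specific exceptions
                    last_error = exc
                    time.sleep(delay)
            raise last_error
        return wrapper
    return decorator


@contextmanager
def timer(label: str):
    """Context manager that reports how long the wrapped block took."""
    start = time.perf_counter()
    try:
        yield
    finally:
        print(f"{label} took {time.perf_counter() - start:.3f}s")


@retry(times=2)
def flaky_call() -> str:
    return "ok"


with timer("flaky_call"):
    print(flaky_call())
```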
⚡ 3. Modern Python (The "Agentic" Stack)
- Type Hinting: Using `typing` (`Annotated`, `Optional`, `Union`) for robust, self-documenting code. Python Typing Docs
- Pydantic V2: Data validation and structured outputs: the backbone of LLM communication.
- Asyncio: `async`/`await`, event loops, and concurrent task execution for responsive agents.
- Testing: Writing unit and integration tests with `pytest` to ensure agent reliability.
- Threading & Multiprocessing: Understanding the differences between threading and multiprocessing and when to use each.
- Generators & Iterators: Understanding how to use generators and iterators to process large datasets.
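A short sketch of how typing, Pydantic, and asyncio fit together in agent code, assuming Pydantic v2 is installed; `fake_llm_call` is a stand-in for a real async request, not a real API:

```python
import asyncio
from typing import Annotated, Optional

from pydantic import BaseModel, Field  # assumes Pydantic v2


class ToolCall(BaseModel):
    """Structured output we expect back from an LLM."""
    tool_name: Annotated[str, Field(min_length=1)]
    arguments: dict[str, str] = Field(default_factory=dict)
    comment: Optional[str] = None


async def fake_llm_call(prompt: str) -> ToolCall:
    """Placeholder for a real async LLM request (e.g. via httpx)."""
    await asyncio.sleep(0.1)  # simulate network latency
    return ToolCall.model_validate_json(
        '{"tool_name": "search", "arguments": {"query": "' + prompt + '"}}'
    )


async def main() -> None:
    # Run several "LLM calls" concurrently instead of one at a time.
    results = await asyncio.gather(*(fake_llm_call(p) for p in ["cats", "dogs"]))
    for call in results:
        print(call.tool_name, call.arguments)


asyncio.run(main())
```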
🌐 4. Connectivity & Data Extraction
- HTTP Clients: `requests` and `httpx` (for async) to interact with external tools and APIs.
- Data Serialization: Handling `JSON`, `YAML`, and `Markdown` programmatically.
- Web Scraping: Utilizing `BeautifulSoup4` and `Playwright`/`Selenium` for browser automation.
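A hedged example of async HTTP plus HTML parsing, assuming `httpx` and `beautifulsoup4` are installed; the httpbin URL is only a convenient public test endpoint:

```python
import asyncio
import json

import httpx
from bs4 import BeautifulSoup  # pip install beautifulsoup4


async def fetch_json(url: str) -> dict:
    """Fetch a JSON API response asynchronously."""
    async with httpx.AsyncClient(timeout=10) as client:
        response = await client.get(url)
        response.raise_for_status()
        return response.json()


def extract_links(html: str) -> list[str]:
    """Pull all hyperlinks out of an HTML page."""
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.find_all("a", href=True)]


async def main() -> None:
    data = await fetch_json("https://httpbin.org/json")  # any public JSON endpoint works
    print(json.dumps(data, indent=2)[:200])
    print(extract_links("<a href='https://example.com'>example</a>"))


asyncio.run(main())
```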
📦 5. Environment & Package Management
- High-Performance Tooling: Using
uvfor ultra-fast dependency management. - Standard Tooling:
pip,venv, andpyproject.tomlconfigurations. - Observability: Implementing structured
loggingto trace agent reasoning steps.
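For observability, even plain standard-library `logging` goes a long way. A minimal sketch of tracing agent reasoning steps (the step contents are made up):

```python
import logging

# Basic structured-ish logging: timestamp, level, logger name, message.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s | %(levelname)s | %(name)s | %(message)s",
)
logger = logging.getLogger("agent")


def plan_step(step: int, thought: str) -> None:
    """Log each reasoning step so agent behaviour can be traced later."""
    logger.info("step=%d thought=%s", step, thought)


plan_step(1, "look up the user's order status")
plan_step(2, "call the orders API with the extracted order id")
```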
📚 6. Resources
These are the books I used to learn Python:
- Book: Python Crash Course
- Book: Python All-in-One For Dummies
- Docs: [Python Docs](https://www.python.org/doc/)
- Docs: [The Hitchhiker’s Guide to Python](http://docs.python-guide.org/en/latest/)
- Docs: [Python 3 Quick Reference](https://pewscorner.github.io/programming/python3_quick_ref.html)
- Ned Batchelder: Facts and Myths about Python Names and Values
- Ned Batchelder: Loop Like a Native - While, For, Iterators, Generators
- Mark Smith: More Than You Ever Wanted to Know About Python Functions (EuroPython 2018)
- Regular Expressions (Regex) Tutorial: How to Match Any Pattern of Text
- Brandon Rhodes: All Your Ducks in a Row - Data Structures in the Std Lib and Beyond (PyCon 2014)
- Python’s super() considered super!
- Context Managers the Easy Way
- Python dataclasses will save you HOURS, also featuring attrs
- Floating Point Arithmetic: Issues and Limitations
- What is the difference between `__getattr__` and `__getattribute__`?
- What is the difference between venv, pyvenv, pyenv, virtualenv, virtualenvwrapper, pipenv, etc?
Extra resources I studied while learning Python basics:
- Intermediate Python
- Advanced Python
You don't need a PhD, but you need to understand the mechanics.
📐 Mathematical Foundations
AI is fundamentally built on math and statistics, so having a strong understanding of these foundations is essential to becoming a good AI engineer. If you're looking for a great place to learn, I highly recommend the Arabic YouTube channel of my college professor, Dr. Ahmed Hagag. He’s excellent at explaining both academic and practical concepts in a simple way. Honestly, he’s one of those rare teachers who truly leaves a lasting impact on his students.
- Discrete Math, Dr Ahmed Hagag
- Linear Algebra, Dr Ahmed Hagag
- Calculus | Math (1), Dr Ahmed Hagag
- Differential Equations, Eng. Yousef Elbaroudy
- Statistical Analysis
- Statistics & Probability, Dr Ahmed Hagag
- Book: Discrete Mathematics and Its Applications, Seventh Edition, by Kenneth H. Rosen (our college reference book)
The goal of this phase is to give you a high-level understanding of what AI is and why it matters, and to introduce the concept of AI agents. This will provide the necessary context and motivation for diving deeper into the technical aspects of AI in the subsequent phases.
- What is AI?
- What is an AI agent?
- Thinking Humanly (The Cognitive Modelling approach)
- Acting Humanly (The Turing Test approach)
- Thinking Rationally (The Laws-of-Thought approach)
- Acting Rationally (The Rational Agent approach)
- Why is AI important?
- Applications of AI in various industries (healthcare, finance, transportation, etc.)
The goal of this phase is to introduce the concept of search algorithms and how they are used in AI for problem solving and planning. This will give you a solid foundation in the fundamental techniques that underlie many AI systems, including agentic AI.
🔍 Search Strategies
- Search Problem Components (State, Actions, Transition Model, Goal Test, Path Cost)
- Search Strategies:
  - Uninformed (Blind) Search:
    - Depth-First Search (DFS)
    - Breadth-First Search (BFS)
    - Uniform Cost Search (UCS)
    - Iterative Deepening Search
    - Bidirectional Search
  - Informed (Heuristic) Search:
    - A* Search
    - Greedy Best-First Search
  - Adversarial Search (Game Search):
    - Minimax Algorithm
    - Alpha-Beta Pruning
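To make the uninformed strategies concrete, here is a minimal breadth-first search over a toy graph dictionary (the graph itself is made up):

```python
from collections import deque


def bfs(graph: dict[str, list[str]], start: str, goal: str) -> list[str] | None:
    """Breadth-first search: returns the path with the fewest edges, or None."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:                              # goal test
            return path
        for neighbor in graph.get(node, []):          # transition model
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(path + [neighbor])
    return None


graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(bfs(graph, "A", "E"))  # ['A', 'B', 'D', 'E']
```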
🧬 Genetic Algorithms (GAs)
- Definition
- Main Motivation for Heuristic Techniques (Including GAs)
- GA Overview and Principle
- Stochastic Operators / Evolutionary Cycle Steps
- Initialization
- Parent Selection
- Recombination (Crossover)
- Mutation
- Survivor Selection (Replacement)
- Evaluation (Fitness Function)
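A tiny, illustrative genetic algorithm for the classic OneMax problem (maximize the number of 1-bits); the population size, mutation rate, and fitness function are arbitrary toy choices:

```python
import random

TARGET_LENGTH = 20
POPULATION_SIZE = 30
GENERATIONS = 50
MUTATION_RATE = 0.05


def fitness(individual: list[int]) -> int:
    """Evaluation: count of 1-bits (the OneMax toy problem)."""
    return sum(individual)


def crossover(a: list[int], b: list[int]) -> list[int]:
    """Recombination: single-point crossover."""
    point = random.randint(1, TARGET_LENGTH - 1)
    return a[:point] + b[point:]


def mutate(individual: list[int]) -> list[int]:
    """Mutation: flip each bit with a small probability."""
    return [1 - bit if random.random() < MUTATION_RATE else bit for bit in individual]


# Initialization
population = [[random.randint(0, 1) for _ in range(TARGET_LENGTH)] for _ in range(POPULATION_SIZE)]

for _ in range(GENERATIONS):
    # Parent + survivor selection: keep the fitter half (truncation selection).
    population.sort(key=fitness, reverse=True)
    parents = population[: POPULATION_SIZE // 2]
    # Recombination + mutation to refill the population.
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(POPULATION_SIZE - len(parents))]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)))
```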
🧠 Knowledge and Experts
- Knowledge
- Experts
- Rules as Knowledge Representation
- Expert Systems
- Components of a Rule-Based Expert System
- Expert Systems vs. Conventional Programs
Resources:
- Search Algorithms, Eng. Yousef Elbaroudy
- AI Search Algorithms In Python
- CS361 Intro. to AI [in Arabic], Dr. Amr S. Ghoneim
- Artificial Intelligence: A Modern Approach (3rd Edition) (our college reference book)
- BFCAI Artificial Intelligence
This phase provides a clear overview of the core AI fields and branches, and introduces the key techniques and algorithms used across the discipline — including machine learning, deep learning, reinforcement learning, and data science. Together these topics form a solid foundation for understanding how AI agents work and for designing and building them.
In this section, you will learn the fundamentals of machine learning, including core concepts, common algorithms, model evaluation, and optimization techniques used to build reliable predictive models.
🧠 1. Core Concepts
- Foundations: Supervised vs Unsupervised learning, train/validation/test split, bias–variance tradeoff, overfitting vs underfitting.
- Supervised Learning: Linear & Logistic Regression, KNN, SVM, Naive Bayes, Decision Trees, Random Forests, XGBoost, LDA/QDA.
- Unsupervised Learning: K-Means, DBSCAN, Hierarchical Clustering, PCA, ICA, Anomaly Detection.
- Model Evaluation: Accuracy, Precision/Recall, F1-score, ROC-AUC, MSE/MAE, Confusion Matrix, Cross-Validation.
- Model Improvement: Feature engineering, regularization, hyperparameter tuning (Grid/Random Search), handling imbalanced data.
- Ensemble Methods: Bagging, Boosting, Stacking.
- Reinforcement Learning: Q-Learning, DQN, Policy Gradients, Actor-Critic.
- Tools & Libraries: Python, NumPy, Pandas, Scikit-learn, Matplotlib/Seaborn (visualization), Jupyter.
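To see how these concepts map to code, here is a minimal supervised-learning workflow with scikit-learn on a built-in dataset (the model choice and hyperparameters are arbitrary examples):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import cross_val_score, train_test_split

# Load a small built-in dataset and split it, keeping class balance (stratify).
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate with a held-out test set and 5-fold cross-validation.
print(classification_report(y_test, model.predict(X_test)))
print("cv accuracy:", cross_val_score(model, X_train, y_train, cv=5).mean())
```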
📚 2. Resources
These are the resources I personally used to learn Machine Learning, and I highly recommend them
I was kinda a clever student 😂 — I mostly studied from my college lectures and followed the course sections step-by-step, then added Andrew Ng’s specialization + random helpful articles and videos.
So there isn’t one fixed source… learning ML is more like exploring from multiple places.
- Machine Learning Specialization (3 courses) — Andrew Ng (Coursera)
- Kaggle Profile (ML projects and notebooks)
- Clustering Algorithms — Google Developers
- Advanced Machine Learning, Dr Ahmed Yousry (in Arabic)
- What is Machine Learning?
- Random Forests Explained
- K-Means Clustering
- Logistic Regression
- How ML Algorithms Work
- AI/ML materials (videos on all topics with source-code examples)
- Book: Machine Learning For Dummies
- Book: Machine Learning Yearning (Technical Strategy for AI Engineers, in the Era of Deep Learning) by Andrew Ng
- Book: Machine Learning with R, Second Edition, by Brett Lantz (our college reference book)
- DeepLearning.AI courses: slides and resources
In this section, you will learn how neural networks work from the ground up, then move to advanced deep learning techniques used in real-world AI systems.
🔹 Neural Networks — Basics
Learn the core building blocks of deep learning:
- what is a neural network and how it works
- Perceptron, neurons, layers (input / hidden / output)
- Activation functions (ReLU, Sigmoid, Tanh, Softmax)
- Forward & backward propagation
- Loss functions
- Gradient descent
- Backpropagation fundamentals
- Overfitting vs underfitting basics

Goal: Understand how a neural network learns mathematically before using frameworks.
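To make that goal concrete, here is a tiny two-layer network trained on XOR with NumPy only, forward and backward passes written by hand (a toy setup: convergence is not guaranteed for every random seed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, the classic problem that needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 neurons, sigmoid activations everywhere.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
lr = 1.0


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


for epoch in range(10000):
    # Forward propagation
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)
    loss = np.mean((y_hat - y) ** 2)                       # mean squared error

    # Backward propagation (chain rule applied by hand)
    d_out = 2 * (y_hat - y) * y_hat * (1 - y_hat) / len(X)
    d_hidden = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

print("final loss:", round(float(loss), 4))
print("predictions:", y_hat.round(2).ravel())
```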
🚀 Deep Learning — Advanced & Practical
Apply neural networks to real-world problems and large-scale models:
Training & Optimization
- Epochs, batch size, iterations
- Gradient descent variants (SGD, Momentum, Nesterov)
- Optimizers: SGD, Adam, RMSProp, AdaGrad
- Regularization: L1/L2, Dropout
- Batch normalization, Layer normalization
- Learning rate schedules, Early stopping
Architectures & Models
- Computer Vision: CNNs, ResNet, EfficientNet, VGG, MobileNet
- Sequential / Time Series: RNNs, LSTM, GRU, BiLSTM
- Attention & Transformers: Transformers, BERT, GPT, T5
- Generative Models: Autoencoders, Variational Autoencoders (VAE), GANs, Diffusion Models
- Graph Neural Networks (GNNs)
Practical Skills
- Model building, training, evaluation, debugging
- Model saving/loading (checkpoints, serialization)
- Fine-tuning pretrained models
- Transfer learning
- Multi-GPU / TPU training
- Deployment basics (ONNX, TorchScript)
Frameworks & Tools
- TensorFlow / Keras
- PyTorch
- Hugging Face Transformers
- CUDA & GPU acceleration
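A minimal PyTorch training loop on synthetic data, showing mini-batches, Adam, dropout, GPU placement, and checkpoint saving (the architecture, data, and hyperparameters are placeholders):

```python
import torch
from torch import nn

# Synthetic binary-classification data so the example is self-contained.
torch.manual_seed(0)
X = torch.randn(512, 20)
y = (X[:, 0] + X[:, 1] > 0).float().unsqueeze(1)

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(0.2),                          # regularization
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
X, y = X.to(device), y.to(device)

for epoch in range(20):                       # epochs
    model.train()
    for i in range(0, len(X), 64):            # mini-batches of size 64
        xb, yb = X[i:i + 64], y[i:i + 64]
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)         # forward pass + loss
        loss.backward()                       # backpropagation
        optimizer.step()                      # parameter update
    if epoch % 5 == 0:
        print(f"epoch {epoch}: loss={loss.item():.4f}")

torch.save(model.state_dict(), "model.pt")    # checkpointing
```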
📚 Resources
- Neural Networks and Deep Learning by Andrew Ng
- Artificial Neural Network (in Arabic)
- Deep Neural Network using Keras (in Arabic)
- What is a neural network?, Dr Ahmed Yousri (in Arabic)
- GeeksforGeeks
- Deep Learning, Computer Vision, NLP, Hessham Assem
- AI/ML materials (videos on all topics with source-code examples)
- Deep Learning Specialization, DeepLearning.AI
- DeepLearning.AI courses: slides and resources
Natural Language Processing (NLP) is the field that focuses on enabling machines to understand, interpret, generate, and interact using human language. This section summarizes core concepts, tasks, models, tooling, evaluation, and practical considerations for building NLP systems.
🔹 Core Concepts
- Linguistic Levels: Morphology, Syntax (POS, parsing), Semantics, Pragmatics, Discourse.
- Representation: Tokens, subwords (BPE/WordPiece/SentencePiece), characters.
- Classical Methods: Bag-of-words, n-grams, TF-IDF.
- Embeddings: word2vec, GloVe, fastText, contextual embeddings (ELMo, BERT-style).
- Text Cleaning: lowercasing, normalization, stopword removal, stemming/lemmatization, de-duplication, cleaning noisy text.
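As a quick illustration of the classical methods, a TF-IDF sketch with scikit-learn on three made-up sentences:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "The cat sat on the mat.",
    "Dogs and cats are common pets.",
    "Stock prices fell sharply today.",
]

# Classical bag-of-words representation weighted by TF-IDF (unigrams + bigrams).
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english", ngram_range=(1, 2))
tfidf = vectorizer.fit_transform(docs)

# The two pet-related documents score closer to each other than to the finance one.
print(cosine_similarity(tfidf[0], tfidf[1]))
print(cosine_similarity(tfidf[0], tfidf[2]))
```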
🔧 NLP Tasks
- Text Classification: Sentiment analysis, intent detection.
- Sequence Tagging: POS tagging, Named Entity Recognition (NER).
- Parsing & Syntax: Dependency & constituency parsing.
- Coreference & Anaphora Resolution.
- Information Extraction: Slot filling, relation extraction.
- Question Answering (QA): Extractive, abstractive, open-domain.
- Machine Translation (MT).
- Summarization: Extractive and abstractive.
- Dialogue Systems & Conversational AI.
- Text Generation & Controlled Generation.
🧠 Models & Architectures
- Sequence Models: RNN, LSTM, GRU, seq2seq with attention.
- Transformers: modern replacement for many sequence models; see the dedicated Foundations: Transformers & LLMs section for transformer specifics (self-attention, encoder/decoder variants, pretraining, PEFT, RAG, multilingual models).
🧰 Tooling & Libraries
- Hugging Face Transformers & Datasets: model hub, tokenizers, pipelines.
- sentence-transformers: semantic embeddings and retrieval.
- spaCy, NLTK, Stanza, Flair: preprocessing, tagging, parsing.
- Vector DBs & Search: FAISS, Chroma, Milvus, Pinecone.
📚 Resources
- Natural Language Processing Specialization, Deeplearning.ai
- DeepLearning.AI AI courses slides and resources
- RNN, LSTM and GRU in Arabic
- Autoencoders in Arabic
- Books: Speech and Language Processing (Jurafsky & Martin), Natural Language Processing with Transformers.
- Books: Practical Natural Language Processing O'Reilly (our college reference book)
- Books: Natural Language Processing with Python O'Reilly (our college reference book)
Computer Vision (CV) is the field of AI that enables machines to interpret and understand visual information from the world, such as images and videos. This section covers core concepts, tasks, models, tooling, and practical considerations for building CV systems.
Actually, I am not a big fan of computer vision, so I will just share the basics that I have learned. If you want to learn more about it, you can check online courses and resources.
🔹 Core Concepts
- Image Representation: Pixels, channels (RGB, grayscale), resolution, aspect ratio.
- Feature Extraction: Edges, corners, textures, shapes, keypoints.
- Image Processing: Filtering, convolution, morphological operations, color spaces (RGB, HSV, LAB).
- Geometric Transformations: Scaling, rotation, translation, affine transformations.
- Image Enhancement: Contrast adjustment, noise reduction, sharpening.
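A short OpenCV sketch of the basic operations above, using a synthetic image so it runs without any file on disk:

```python
import cv2
import numpy as np

# Synthetic image: a green square on a black background.
image = np.zeros((200, 200, 3), dtype=np.uint8)
cv2.rectangle(image, (50, 50), (150, 150), (0, 255, 0), thickness=-1)

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)       # color space conversion
blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # filtering / noise reduction
edges = cv2.Canny(blurred, 50, 150)                  # edge detection (feature extraction)
resized = cv2.resize(edges, (100, 100))              # geometric transformation
rotated = cv2.rotate(image, cv2.ROTATE_90_CLOCKWISE)

print(gray.shape, edges.shape, resized.shape, rotated.shape)
# To work with a real photo instead: image = cv2.imread("photo.jpg")
```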
🔧 CV Tasks
- Image Classification: Assigning labels to images (e.g., cat vs dog).
- Object Detection: Locating and classifying objects in images (bounding boxes).
- Segmentation: Pixel-level classification (semantic, instance, panoptic).
- Image Generation: Creating new images from scratch or editing existing ones.
- Pose Estimation: Detecting human or object poses and keypoints.
- Optical Character Recognition (OCR): Extracting text from images.
- Face Recognition & Analysis: Face detection, verification, emotion recognition.
- Video Analysis: Action recognition, tracking, scene understanding.
🧠 Models & Architectures
- Classical Methods: Haar cascades, HOG + SVM, template matching.
- Deep Learning Models: CNNs, ResNet, EfficientNet, VGG, MobileNet, YOLO, SSD, Faster R-CNN.
- Generative Models: GANs, VAEs, Diffusion Models for image generation.
- Vision Transformers (ViT): Transformer-based architectures for vision tasks.
- Foundation Models: CLIP, DALL-E, Stable Diffusion.
🧰 Tooling & Libraries
- OpenCV: Core library for image processing and computer vision tasks.
- PyTorch & TensorFlow: Deep learning frameworks with vision modules.
- Hugging Face Transformers: Pretrained models for vision tasks.
- Scikit-Image: Image processing in Python.
📚 Resources
In this section, you will learn how to convert raw, messy data into clean, structured, model-ready data. Good data preparation often improves your model more than changing the algorithm itself, which makes this one of the most important topics to study.
🧹 Data Cleaning (First Step Always)
- Handling missing values (**drop**, **fill**, **interpolate**)
- Removing duplicates
- Fixing incorrect data types (string → number/date)
- Detecting and treating outliers
- Cleaning inconsistent labels (e.g., Male/male/M)
🔄 Data Transformation
- Feature scaling: Normalization, Standardization
- Encoding categorical variables: Label Encoding, One-Hot Encoding
- Basic feature engineering: Creating new features, combining columns, feature selection
⚖️ Handling Imbalanced Data
- Oversampling
- Undersampling
- SMOTE (Synthetic sampling)
- Using class weights
📊 Data Splitting & Validation
- Train / Validation / Test split
- Stratified splitting
- K-Fold Cross-Validation
- Avoiding data leakage
🎨 Data Augmentation
- Image augmentation (flip, rotate, crop, noise)
- Text augmentation (synonyms, paraphrasing)
- Audio augmentation (noise, pitch shift, time stretch)
💾 Working with Large Datasets
- Data generators / batch loading
- Efficient file formats (e.g., Parquet or HDF5 instead of plain CSV)
Libraries: Pandas & NumPy, Scikit-learn, TensorFlow data loaders.
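Putting several of these steps together, a minimal cleaning-and-preparation sketch on a made-up, messy DataFrame (the column names and values are invented):

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# A tiny messy dataset: missing values, a duplicate row, inconsistent labels.
df = pd.DataFrame({
    "age": [25, None, 40, 40, 31],
    "gender": ["Male", "male", "M", "M", "female"],
    "income": [3000, 4500, None, None, 5200],
    "churn": [0, 1, 0, 0, 1],
})

df = df.drop_duplicates()                                   # remove exact duplicates
df["age"] = df["age"].fillna(df["age"].median())            # fill missing values
df["income"] = df["income"].interpolate()                   # or interpolate them
df["gender"] = df["gender"].str.lower().str[0].map({"m": "male", "f": "female"})  # fix labels
df = pd.get_dummies(df, columns=["gender"])                 # one-hot encoding

X, y = df.drop(columns="churn"), df["churn"]
# On a real (larger) dataset, pass stratify=y to preserve the class balance.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)              # fit on train only to avoid leakage
X_test_scaled = scaler.transform(X_test)
print(X_train_scaled.shape, X_test_scaled.shape)
```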
Data Science involves extracting insights from data through analysis and visualization. This section covers the basics of exploring, analyzing, and visualizing data to inform decision-making. If you've reached this point, you're probably already familiar with most of these topics.
🔍 Exploratory Data Analysis (EDA)
- Data Inspection: Checking data types, missing values, duplicates, summary statistics.
- Univariate Analysis: Distributions, outliers, central tendency (mean, median, mode).
- Bivariate/Multivariate Analysis: Correlations, relationships between variables.
- Data Profiling: Understanding data structure and quality.
Data Types & Categorical Variables
- Numerical Data: Continuous vs discrete, scaling, normalization.
- Categorical Data: Nominal vs ordinal, encoding techniques (label, one-hot).
- Datetime Data: Parsing, extracting features (year, month, day), handling time zones.
📈 Data Visualization
- Chart Types: Bar charts, line plots, scatter plots, histograms, box plots, heatmaps.
- Best Practices: Choosing the right visualization, avoiding misleading charts, color theory.
- Interactive Visualizations: Basic dashboards with libraries like Plotly.
🧰 Tools & Libraries
- Pandas: Data manipulation and analysis.
- NumPy: Numerical computing.
- Matplotlib & Seaborn: Static visualizations.
- Plotly & Bokeh: Interactive visualizations.
- Jupyter Notebooks: For exploratory analysis.
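A quick EDA sketch using seaborn's built-in `tips` dataset (loading it needs an internet connection the first time); in a notebook you would run these cells interactively:

```python
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

df = sns.load_dataset("tips")   # small demo dataset shipped with seaborn

# Data inspection
df.info()
print(df.describe())
print(df.isna().sum())

# Univariate analysis: distribution of a numeric column
sns.histplot(df["total_bill"], bins=20)
plt.title("Distribution of total bill")
plt.show()

# Bivariate analysis: correlations between numeric columns
sns.heatmap(df.select_dtypes("number").corr(), annot=True, cmap="coolwarm")
plt.title("Correlation between numeric columns")
plt.show()
```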
📚 Resources
- Data Analysis with Python, Coursera
- Kaggle profile (my projects and notebooks)
Understand the core technology behind modern AI systems like ChatGPT, Gemini, Claude, and open-source LLMs.
🔹 Transformers — Basics
- Tokens & tokenization
- Embeddings & positional encoding
- Self-attention & multi-head attention
- Encoder vs Decoder architecture
- Feed-forward layers
- Why Transformers replaced RNNs/CNNs in NLP
- Popular models: BERT, GPT, T5
🚀 Large Language Models (LLMs)
- Pretraining (next-token prediction)
- Instruction tuning
- RLHF / alignment techniques
- Scaling laws & model sizes
- Context windows & long-context models
- Inference basics (sampling, temperature, top-k/top-p)
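To demystify the inference knobs, here is a toy implementation of temperature and top-k sampling over fake logits; real inference libraries do this (plus top-p) for you:

```python
import numpy as np

rng = np.random.default_rng(42)


def sample_next_token(logits: np.ndarray, temperature: float = 1.0, top_k: int = 0) -> int:
    """Sample a token id from raw logits using temperature and optional top-k filtering."""
    logits = logits / max(temperature, 1e-6)     # low temperature -> sharper distribution
    if top_k > 0:
        cutoff = np.sort(logits)[-top_k]         # keep only the k highest logits
        logits = np.where(logits < cutoff, -np.inf, logits)
    probs = np.exp(logits - logits.max())        # softmax (numerically stable)
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))


fake_logits = np.array([2.0, 1.0, 0.5, -1.0])    # pretend vocabulary of 4 tokens
print("greedy-ish (T=0.1):", sample_next_token(fake_logits, temperature=0.1))
print("creative (T=1.5, top-k=3):", sample_next_token(fake_logits, temperature=1.5, top_k=3))
```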
⚙️ Fine-Tuning & Customization
- Prompt engineering vs fine-tuning
- Full fine-tuning
- Parameter-efficient methods (LoRA, QLoRA, adapters)
- Instruction / supervised fine-tuning (SFT)
- Retrieval-Augmented Generation (RAG)
- Domain adaptation & evaluation
🛠️ Applications & Practical Usage
- Chatbots & assistants
- RAG systems & knowledge search
- Summarization & question answering
- Code generation
- Agents & tool calling
- Deployment & optimization (quantization, batching, GPUs)
- Open-source ecosystem: Hugging Face, vLLM, Ollama
Learn how to build intelligent systems that can reason, plan, use tools, and interact autonomously with users and environments.
📝 Prompt Engineering
- Understanding prompts & instructions
- Prompt structure (system / user / tools)
- Types of prompting: zero-shot, few-shot, chain-of-thought
- Role prompting & persona design
- Output formatting (JSON, structured outputs)
- Prompt optimization & evaluation
- Guardrails & prompt safety
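A hedged sketch of a structured prompt via the official `openai` client; the model name is only an example and `OPENAI_API_KEY` is assumed to be set:

```python
from openai import OpenAI  # assumes the official openai package and OPENAI_API_KEY are configured

client = OpenAI()

system_prompt = (
    "You are a customer-support assistant. "
    'Always answer with a JSON object of the form {"intent": str, "reply": str}.'
)

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "My order #123 hasn't arrived yet."},
]

response = client.chat.completions.create(
    model="gpt-4o-mini",   # the model name is just an example; use whatever you have access to
    messages=messages,
    temperature=0.2,       # lower temperature for predictable, structured answers
)

raw = response.choices[0].message.content
print(raw)  # next step: parse with json.loads() and validate with Pydantic before trusting it
```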
🧠 Agentic Concepts (Core Building Blocks)
- Environment & context handling
- Short-term vs long-term memory
- Memory stores (vector DB, cache, databases)
- Persistence & state management
- Tool / function calling (including enabling auto tool choice in vLLM and SGLang)
- Planning & reasoning loops
- Reflection & self-correction
- Social ability & human-in-the-loop
⚙️ Agent Architectures & Patterns
- Reactive agents
- Deliberative agents
- Hybrid agents
- ReAct (Reason + Act): ReAct Agent from Scratch Kaggle, RAG (ReAct Based), Advance ReAct Agent
- Plan-Execute
- Tree/Graph of Thoughts
- Tool-augmented agents
- Workflow / pipeline agents
- Multi-agent collaboration systems (MAS, Crew-style)
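To show the ReAct control flow without any real model, here is a toy loop where `decide()` is a stub standing in for the LLM; the tools and question are invented:

```python
# A toy ReAct-style loop: Reason -> Act -> Observe, repeated until "finish".

def search_tool(query: str) -> str:
    return f"(pretend search results for '{query}')"


def calculator_tool(expression: str) -> str:
    return str(eval(expression))  # demo only; never eval untrusted input in real systems


TOOLS = {"search": search_tool, "calculator": calculator_tool}


def decide(question: str, history: list[str]) -> dict:
    """Stand-in for the LLM: look at the history, then pick the next action."""
    if not history:
        return {"thought": "I should compute this.", "action": "calculator", "input": "2 + 2 * 10"}
    return {"thought": "I have enough information.", "action": "finish", "input": history[-1]}


def react_agent(question: str, max_steps: int = 5) -> str:
    history: list[str] = []
    for _ in range(max_steps):
        step = decide(question, history)                     # Reason
        if step["action"] == "finish":
            return step["input"]
        observation = TOOLS[step["action"]](step["input"])   # Act
        history.append(observation)                          # Observe, then loop again
    return "gave up after max_steps"


print(react_agent("What is 2 + 2 * 10?"))
```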
📚 Knowledge & Retrieval (RAG)
- Document chunking strategies
- Embeddings & vector databases
- Similarity search
- Retrieval-Augmented Generation (RAG)
- Hybrid search (keyword + vector)
- Context injection techniques
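A bare-bones retrieval sketch: chunking, a hypothetical `embed()` stand-in for a real embedding model, and cosine-similarity search (exactly what a vector database automates for you):

```python
import numpy as np


def embed(text: str) -> np.ndarray:
    """Hypothetical placeholder for a real embedding model or embeddings API."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=64)


def chunk(document: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Naive fixed-size chunking with overlap."""
    return [document[i:i + size] for i in range(0, len(document), size - overlap)]


def retrieve(query: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Cosine-similarity search over chunk embeddings."""
    q = embed(query)
    scores = [float(q @ embed(c) / (np.linalg.norm(q) * np.linalg.norm(embed(c))))
              for c in chunks]
    best = np.argsort(scores)[::-1][:top_k]
    return [chunks[i] for i in best]


document = "Your product documentation or knowledge base text goes here. " * 20
chunks = chunk(document)
context = "\n".join(retrieve("How do I reset my password?", chunks))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How do I reset my password?"
print(prompt[:300])  # this prompt is what you would send to the LLM (context injection)
```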
🛠️ Tools, Frameworks & Deployment
- LangChain, LlamaIndex
- CrewAI / AutoGen / multi-agent frameworks
- OpenAI function calling / tool APIs
- FastAPI / backend integration
- Async workflows & task queues
- Monitoring, logging, evaluation
- Cost optimization & latency control
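A minimal FastAPI layer around an agent; `run_agent()` is a placeholder for whatever framework you use, and the file is assumed to be saved as `app.py` (run with `uvicorn app:app --reload`):

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Agent API")


class ChatRequest(BaseModel):
    session_id: str
    message: str


class ChatResponse(BaseModel):
    reply: str


async def run_agent(session_id: str, message: str) -> str:
    # Placeholder: call your LangChain / LangGraph / CrewAI agent here.
    return f"echo from agent ({session_id}): {message}"


@app.post("/chat", response_model=ChatResponse)
async def chat(request: ChatRequest) -> ChatResponse:
    reply = await run_agent(request.session_id, request.message)
    return ChatResponse(reply=reply)
```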
🤖 AI agent frameworks
These are the most popular frameworks for building AI agents, and they are all open-source and free to use. I highly recommend learning at least one of them, as they will make your life much easier when building AI agents.
- LangChain: for building AI agents with LLMs and tools (LangChain Documentation)
- LangSmith: for monitoring and evaluating AI agents (LangSmith Documentation)
- LangGraph: for building AI agents with graph-based reasoning and agent orchestration (LangGraph Documentation). Courses: AI Agents in LangGraph (DeepLearning.AI), Advanced AI Agents Tutorial (YouTube), AI Agents in LangGraph (GitHub)
- CrewAI: for multi-AI-agent systems (CrewAI Documentation). Courses: Multi-AI Agent Systems with CrewAI (DeepLearning.AI), CrewAI Crash Course (YouTube), How to Use AI Agents to Do All Your Work (YouTube)
🖥️ LLM Servers & API Providers
🔒 Safety, Evaluation & Reliability
- Hallucination reduction (model temperature)
- Output validation
- Guardrails & constraints
- Agent testing & benchmarking
- Human feedback loops
- Security & prompt injection protection
- FastAPI: Building the API Layer for AI Agents (Python FastAPI Tutorial)
- Chroma DB: The Ultimate Vector Database for AI Agents
- MongoDB: The NoSQL Database for AI Agents, MongoDB Essential Training LinkedIn Learning
- Redis: The In-Memory Data Store for AI Agents
- RabbitMQ: Message Brokering for AI Agents
- Agent Monitoring & Evaluation
- Evaluation Metrics: Measuring AI Agent Performance
- Monitoring Tools: Keeping an Eye on AI Agents in Action
- Cost Management: Optimizing Expenses for AI Agent Operations
(Saying Goodbye to Localhost and welcoming the World)
⚙️ Deployment Optimization
- Inference: batching, streaming, caching, quantization (int8, 4-bit), FP16.
- Latency & Cost: model size vs latency trade-offs, distillation, pruning.
- Safety: input validation, hallucination mitigation, output filters, user feedback loops.
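One of the cheapest latency/cost wins is caching identical requests; a toy in-memory sketch (production systems would typically use Redis with a TTL instead of a Python dict):

```python
import hashlib
import time

# Simplest possible response cache: identical prompts never hit the model twice.
_cache: dict[str, str] = {}


def cached_generate(prompt: str, generate_fn) -> str:
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key in _cache:
        return _cache[key]              # cache hit: skip the expensive call
    result = generate_fn(prompt)
    _cache[key] = result
    return result


def slow_model(prompt: str) -> str:     # stand-in for a real LLM call
    time.sleep(1)
    return f"answer to: {prompt}"


start = time.perf_counter()
cached_generate("What is RAG?", slow_model)   # slow (cache miss)
cached_generate("What is RAG?", slow_model)   # instant (cache hit)
print(f"total time: {time.perf_counter() - start:.2f}s")
```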
🐳 Containerization & Orchestration
- Docker: Containerizing AI Agents for Scalable Deployment, Docker Foundations Professional Certificate
- Nginx: Load Balancing and Reverse Proxy for AI Agent APIs (if building your own API from scratch without using a service).
🚀 High-Performance Inference Engines
- vLLM & SGLang: High-Performance Inference Engines for AI Agents (vLLM Docs, SGLang Docs)
- Custom Inference Templates: Enabling auto tool choice in vLLM (tool calling) and SGLang (see the sketch below).
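A hedged sketch of offline batch inference with vLLM, assuming a GPU machine with `vllm` installed; the model name is only an example (see the docs above for serving and tool-calling flags):

```python
from vllm import LLM, SamplingParams

# Any Hugging Face model you can actually run will do here.
llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct")
params = SamplingParams(temperature=0.2, top_p=0.9, max_tokens=128)

prompts = [
    "Summarize what an AI agent is in one sentence.",
    "List three uses of RAG.",
]

# vLLM batches these prompts internally for high-throughput generation.
for output in llm.generate(prompts, params):
    print(output.outputs[0].text.strip())
```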
☁️ Cloud Platforms & GPU Providers
Actually, there are thousands of cloud providers, but these are the ones I've used over the past 3 years and recommend.
- GCP: Google Cloud Platform and its Vertex AI services (Google Generative AI, Model Garden for open-source models)
- DigitalOcean: I use its VPS offerings for hosting small projects and APIs
- Heroku: Platform as a Service; good for hosting small projects
- Hosting.com & Hostinger: the cheapest hosting providers
- Vast.ai & RunPod: GPU cloud providers for when you need to serve your LLM or other models
But please, when dealing with hosting, keep your eyes on 3 things: 1. your visa 😭😭, 2. security and firewalls, 3. resource usage.
- GPUs: Accelerating AI Agents with Graphics Processing Units
🌟 Open to contributions — feel free to suggest improvements or additional resources!
This repository includes additional files with extra learning materials:
- 📱 `ACCOUNTS_TO_FOLLOW.md` - Recommended social media accounts for AI/ML content
- 📺 `CHANNELS_TO_FOLLOW.md` - YouTube channels and content creators
- 📚 `EXTRA_RESOURCES.md` - Additional learning materials and resources
- 💡 `PROJECTS_IDEAS.md` - Practical project ideas to apply your learning
⚠️ Note: Resources in these additional files may include community contributions and materials I haven't personally verified or watched yet. The main roadmap above contains only resources I've personally used and recommend.
Contributions are welcome! If you have suggestions, resources, or improvements:
- 🍴 Fork the repository
- 🔨 Create your feature branch (`git checkout -b feature/AmazingResource`)
- 💾 Commit your changes (`git commit -m 'Add some AmazingResource'`)
- 📤 Push to the branch (`git push origin feature/AmazingResource`)
- 🎯 Open a Pull Request
If this roadmap helped you in your AI learning journey, please consider:
- ⭐ Starring this repository
- 🔄 Sharing it with others
- 💬 Providing feedback or suggestions
Before we dive in — I'm Romani. Been freelancing since 2023, now building AI systems at Penny Software. Navigating the overwhelming world of AI learning pushed me to create this roadmap: a clear, practical path built on hands-on projects and verified resources, not endless theory.
AI Engineer @ Penny Software | AI/ML & Agentic AI Specialist
🎓 Graduate of the Faculty of Computers and Artificial Intelligence, Benha University
Have questions? Want to collaborate? Let's connect!
Made with ❤️ by Romani Nasrat
© 2026 Agentic AI Roadmap • MIT License
