Become a sponsor to Aniket Maurya
Building Celesto AI — Developer-first platform to deploy long-running agents and MCP servers, with built-in observability.
Previously at Lightning AI, I spent my time as a Research Engineer and Developer Advocate. I led the development of LitServe, an open-source LLM inference framework that improved throughput by 50% and powered real-time apps with thousands of concurrent users. Alongside engineering, I grew Lightning's developer ecosystem — shipping tutorials, writing technical blogs, and building agentic AI demos for customers and the community — helping thousands of devs adopt and scale with Lightning's platform.
Before that, I built large-scale systems at Quinbay (part of Blibli.com), where computer vision and onboarding pipelines cut hours of processing down to seconds, and at Coviam, where deep learning models served 100K+ requests per minute.
🛠 Stack & focus areas: Python, PyTorch, FastAPI, React, Postgres, RAG, Generative AI, Triton, inference optimization.
🌍 Open source: My projects (LitServe, GradsFlow, Chitra, ZeroZen) are used by thousands of developers worldwide.
I’m passionate about creating AI systems that don’t just demo well, but run at scale — and turning those systems into products people love.
Featured work
- Lightning-AI/LitServe (Python, 3,816 stars): A minimal Python framework for building custom AI inference servers with full control over logic, batching, and scaling.
- CelestoAI/agentor (Python, 161 stars): The fastest way to build and deploy reliable AI agents, MCP tools, and agent-to-agent workflows, deployable in a production-ready serverless environment.