"Artificial intelligence is a marathon, not a sprint." - Demis Hassabis, co-founder of Google DeepMind An ambitious vision, an ear...
"Artificial intelligence is a marathon, not a sprint." - Demis Hassabis, co-founder of Google DeepMind
An ambitious vision, an early reality check
Krutrim was unveiled in 2023 by Bhavish Aggarwal of Ola with a sweeping promise to build India’s own end-to-end artificial intelligence stack, from foundational models and chips to cloud infrastructure. The vision was bold and timely, arriving as global AI development accelerated at record speed. Less than two years later, that ambition is facing serious execution challenges.
Timelines slip as priorities shift
Current and former employees point to repeated delays in research and development timelines and growing uncertainty around product launches. Several internal projects were deprioritised or quietly shelved as resources were stretched thin. The consumer-facing Kruti app reportedly took a backseat, reflecting a broader struggle to align talent, compute and expectations.
Chip ambitions lose momentum
One of Krutrim’s most distinctive goals was building indigenous AI chips. Over the past year, several senior leaders exited, and the chip team has largely been disbanded, with only a small group retained. Former staff suggest that without deep capital, sustained research depth and long timelines, such efforts were hard to sustain.
Scale and capital create stark contrasts
Industry observers highlight a structural gap. Global leaders like OpenAI and Anthropic have raised tens of billions of dollars, enabling massive teams and long research cycles. Krutrim, despite raising $74.9 million and committing significant promoter capital, faces a scale mismatch that makes rapid foundational breakthroughs difficult.
A cautionary lesson for Indian AI builders
Krutrim’s journey underscores a familiar truth in deep tech: speed without depth can become a constraint. Building foundational AI demands patience, focus and long-term investment, especially when expanding across multiple complex verticals simultaneously.
Summary
Krutrim’s challenges highlight the gap between ambitious AI visions and the slow, resource-intensive work of execution. Delays, leadership exits and scaled-back chip plans reflect how foundational AI requires sustained capital, talent and time to mature.
Food for thought
Can India realistically build foundational AI platforms without first narrowing its execution and capital gaps?
AI concept to learn: Foundational Models in AI
Foundational models are large AI systems trained on vast datasets that can be adapted to many downstream tasks. They require massive computing power, high-quality data and long training cycles. Their development is expensive but forms the backbone of modern AI applications.
[The Billion Hopes Research Team shares the latest AI updates for learning and awareness. Various sources are used. All copyrights acknowledged. This is not professional, financial, personal or medical advice. Please consult domain experts before making decisions. Feedback welcome!]