Indian AI startup Krutrim to receive $230 million from SoftBank-backed billionaire Bhavish Aggarwal

Published by Pratik Patil

Ola founder Bhavish Aggarwal is investing $230 million in his artificial intelligence startup, Krutrim, as India ramps up efforts to carve out a space in an industry largely led by U.S. and Chinese tech firms. The funding is being funneled primarily through Aggarwal’s family office, according to a source familiar with the matter. In a post on X, he revealed plans to secure a total investment of $1.15 billion for Krutrim by next year, with the remaining capital expected to come from external investors.

The funding announcement comes as Krutrim, which specializes in developing large language models (LLMs) tailored for Indian languages, has decided to make its AI models open source. Additionally, the company has unveiled plans to construct what it claims will be India’s most powerful supercomputer, collaborating with Nvidia to bring this vision to life.

Krutrim recently introduced Krutrim-2, a 12-billion-parameter language model that the company says performs strongly in understanding and processing Indian languages. Sentiment analysis tests shared by the company indicate that Krutrim-2 achieved a score of 0.95, significantly outperforming competing models, which scored around 0.70. It also posted an 80% success rate on code generation tasks.
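The company has not said exactly how these scores are calculated; benchmark figures in this range are commonly reported as plain accuracy over a labeled test set. The sketch below is a hypothetical harness illustrating that reading of the numbers, not Krutrim's own evaluation code, and `classify_sentiment` is a stand-in for whatever model call is actually benchmarked.

```python
# Hypothetical sketch: reading a sentiment score such as 0.95 as plain accuracy
# over a labeled test set. This is not Krutrim's evaluation code;
# classify_sentiment() is a stand-in for a call to the model under test.

def classify_sentiment(text: str) -> str:
    """Placeholder for the model being benchmarked."""
    raise NotImplementedError("wire up the model under evaluation here")

def sentiment_accuracy(examples: list[tuple[str, str]]) -> float:
    """examples: (text, gold_label) pairs; returns the fraction labeled correctly."""
    correct = sum(1 for text, gold in examples if classify_sentiment(text) == gold)
    return correct / len(examples)

# On this reading, a score of 0.95 means 95 of every 100 test sentences were
# labeled correctly, versus roughly 70 of 100 for the competing models cited.
```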

As part of its open-source initiative, Krutrim has made several specialized AI models available to developers, covering areas such as speech translation, text search, and image processing—all optimized for India’s linguistic diversity. Acknowledging the progress made over the past year, Aggarwal expressed optimism about India’s AI ecosystem evolving into a world-class player through community collaboration.
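For developers, the practical upshot of the open-source release is that the weights can be downloaded and run locally. The snippet below is a minimal sketch using the Hugging Face transformers library; the repository id is a placeholder, since the article does not say where the checkpoints are published.

```python
# Minimal sketch of loading an openly released checkpoint with Hugging Face
# transformers. The repo id is a placeholder, not a confirmed location of
# Krutrim's weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "example-org/krutrim-2-placeholder"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Generate a short completion for a Hindi prompt.
prompt = "भारत की राजधानी क्या है?"  # "What is the capital of India?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```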

India has been working to position itself as a strong contender in the global AI race, currently led by firms from the U.S. and China. The recent debut of DeepSeek’s R1 reasoning model, built on a relatively modest budget, has made waves in the industry and drawn close attention in India. The country has committed to hosting DeepSeek’s large language models on domestic servers, and Krutrim’s cloud division began serving DeepSeek’s models on Indian servers last week.
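The article does not describe the interface Krutrim’s cloud exposes for the hosted DeepSeek models; many hosted-LLM services follow the OpenAI-compatible chat-completions convention, and the sketch below assumes that pattern, with a placeholder base URL, credential, and model name rather than Krutrim’s actual endpoint.

```python
# Hypothetical sketch: querying a hosted model through an OpenAI-compatible
# chat-completions endpoint. Base URL, API key variable, and model name are
# placeholders; Krutrim's actual cloud interface may differ.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://cloud.example.invalid/v1",  # placeholder, not a real endpoint
    api_key=os.environ["EXAMPLE_API_KEY"],        # placeholder credential
)

response = client.chat.completions.create(
    model="deepseek-r1",  # placeholder model name
    messages=[{"role": "user", "content": "Explain what a reasoning model is in two sentences."}],
)
print(response.choices[0].message.content)
```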

To further refine AI models tailored to India’s linguistic landscape, Krutrim has developed an evaluation system called BharatBench, aimed at filling the gap left by existing benchmarks that predominantly focus on English and Chinese. The lab has also integrated a 128,000-token context window into its models, enabling them to handle longer and more complex conversations. Performance metrics published by the company highlight Krutrim-2’s exceptional capabilities, with grammar correction scores reaching 0.98 and multi-turn conversation performance at 0.91.
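In practical terms, a 128,000-token context window means an application can feed very long documents or conversation histories into a single request, as long as it tracks how many tokens the prompt consumes. The sketch below shows that bookkeeping with a generic tokenizer as a stand-in; the actual count depends on the model’s own tokenizer.

```python
# Sketch: checking a prompt against a 128,000-token context window before
# sending it to a model. The GPT-2 tokenizer is only a stand-in; real counts
# depend on the target model's own tokenizer.
from transformers import AutoTokenizer

CONTEXT_WINDOW = 128_000  # tokens, as reported for Krutrim's models

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # stand-in tokenizer

def fits_in_context(prompt: str, reserved_for_reply: int = 1_000) -> bool:
    """Return True if the prompt plus a reply budget fits inside the window."""
    n_tokens = len(tokenizer.encode(prompt))
    return n_tokens + reserved_for_reply <= CONTEXT_WINDOW

conversation_history = "\n".join(f"Turn {i}: ..." for i in range(5_000))
print(fits_in_context(conversation_history))
```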

This investment follows the January launch of Krutrim-1, India’s first large language model, built on a 7-billion-parameter architecture. The much-anticipated deployment of Krutrim’s supercomputer, developed in partnership with Nvidia, is set for March, with further expansion planned throughout the year.