Ola’s Bhavish Aggarwal unveils Krutrim’s inaugural AI models and more

Friday, December 15, 2023, 5 min Read

Krutrim SI Designs, the brainchild of Ola Co-founder Bhavish Aggarwal, has unveiled its first product: Krutrim, a family of artificial intelligence (AI) models built specifically for the requirements of the Indian ecosystem.

“AI will define the future paradigms of the economy and culture, and to become a true leader in the world, India needs to become a global leader in AI,” Aggarwal said during Krutrim’s launch event.

“India-first AI should be able to understand the uniqueness and the right cultural context. It needs to be trained on unique data sets specific to us. And on top of it all, it needs to be accessible to India, with India-first cost structures,” he added.

The base large language model (LLM)—dubbed Krutrim, which means 'artificial' in Sanskrit—is trained on two trillion tokens. The company claims that the LLM has the largest representation of Indian data ever used in training.

Most pre-trained models are built primarily on English tokens, with only a small share of Indic-language tokens. Krutrim claims to have changed that, with the base model incorporating around 20 times more Indic tokens than any other existing model.

According to the company, Krutrim, which was trained in three months, outperforms GPT-4 on Indic-language tasks. Equipped with voice capabilities, Krutrim can understand various Indian languages and provide responses that reflect a nuanced understanding of the Indian ethos.

Krutrim can understand 22 Indian languages and generate content in about 10, including Marathi, Hindi, Bengali, Tamil, Kannada, Telugu, Odia, Gujarati, and Malayalam.

Tokens are the sub-word units these models use for both input and output; by operating on sequences of tokens, the models process and generate language.
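To illustrate the idea, here is a minimal, purely illustrative sketch of greedy sub-word tokenisation in Python. The function and the toy vocabularies are invented for demonstration and are not Krutrim's actual tokenizer; the point is simply why vocabulary coverage matters, as a word from a language with few dedicated sub-word entries fragments into many more tokens.

```python
# Purely illustrative sketch: a toy greedy longest-match sub-word tokenizer.
# The vocabularies below are invented for demonstration; Krutrim's actual
# tokenizer and vocabulary have not been published in this article.

def tokenize(text: str, vocab: set) -> list:
    """Split text into the longest vocabulary pieces, scanning left to right."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible match starting at position i.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            # No vocabulary entry matches: fall back to a single character.
            tokens.append(text[i])
            i += 1
    return tokens

# A vocabulary with dedicated sub-words covers the word in 2 tokens...
print(tokenize("namaste", {"nam", "aste", "a", "e", "m", "n", "s", "t"}))
# ...while a vocabulary without them fragments it into 7 single-character
# tokens, which inflates cost and eats into the model's context window.
print(tokenize("namaste", {"a", "e", "m", "n", "s", "t"}))
```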

The company is also building Krutrim Pro, a very large multi-modal model that will have more sophisticated problem-solving and task-execution capabilities. It is expected to be released in the next quarter.

People can now sign up for the base model, with early access rolling out in batches starting today. The full open release of Krutrim is scheduled for January 2024, and Krutrim APIs will be accessible to developers from February 2024.

“All Ola group companies are already utilising Krutrim for a variety of internal tasks, including customer support, voice and chat, and customer sales calls,” Aggarwal said. 

The development of the AI models involves highly skilled computer scientists, both in Bengaluru and in San Francisco.

Krutrim SI Designs, Aggarwal's recent venture, was officially registered in April, according to documents filed with the Ministry of Corporate Affairs. The sole director listed for Krutrim is Krishnamurthy Venugopala Tenneti.

In October, Aggarwal’s latest venture secured $24 million in debt funding from Matrix Partners. The convertible debentures from Matrix are expected to convert into equity shares in the future, potentially coinciding with Krutrim SI Designs raising additional equity funding.

Aggarwal previously informed YourStory that a substantial portion of the funding for the new AI venture comes from his personal contributions. The founder also disclosed that he has acquired multiple companies within the silicon-AI domain.

Apart from designing AI models tailored to the Indian context, the new company will focus on the development of silicon chips and establish a cloud infrastructure for delivering tailored solutions to customers.

Krutrim’s launch follows Google’s introduction of its largest and most capable artificial intelligence model, Gemini, just a week ago.

Silicon and infrastructure

Krutrim is working on three core areas: applied AI and engineering, infrastructure, and silicon hardware and software. The goal is to develop cost-effective silicon systems and platforms tailored to India’s unique needs.

It has developed a novel architecture that incorporates multiple chiplets. Each chiplet—a small piece of silicon—specialises in specific functions. The company claims its system-on-package is very energy-efficient and designed with a cost structure affordable for India.

“The architecture is ready and now we are marching on to implementation,” said Sambit Sahu, who, along with Raguraman Barathalwar, leads the silicon hardware and software efforts at Krutrim.

Aggarwal’s new venture plans silicon prototypes next year, tape-out by the following year's end, and engineering samples by mid-2025, culminating in a production-ready AI server by the end of 2025.

The firm is also developing infrastructure for AI. Given the energy demands of AI workloads, Krutrim is building from the ground up: cooling technology, racks delivering up to 200 kilowatts, networks that can support up to 100,000 GPUs in a cluster, and AI servers and entire data centres designed for future AI needs, explained Navendu Agarwal, who, along with Arun Madhusudanan, leads the infrastructure efforts at Krutrim.

Data centres are typically measured on power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by computing equipment. In India, data centres often run at a PUE of 1.5, meaning roughly 50% more energy is spent on cooling and other overheads on top of each unit used for useful computing, the Ola chief remarked, adding that Krutrim’s technology achieves a PUE of 1.1, cutting that overhead to 10%.
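As a quick back-of-the-envelope check of those figures (a sketch of the standard PUE arithmetic, not Krutrim's own methodology), the overhead on top of each unit of useful computing is simply PUE minus 1:

```python
# Back-of-the-envelope check of the PUE figures quoted above.
# PUE = total facility energy / energy consumed by computing equipment,
# so the overhead on top of each unit of useful computing is PUE - 1.

def overhead_per_compute_unit(pue: float) -> float:
    """Energy spent on cooling, power conversion, etc. per unit of compute energy."""
    return pue - 1.0

for pue in (1.5, 1.1):
    print(f"PUE {pue}: {overhead_per_compute_unit(pue):.0%} extra energy "
          f"on top of each unit of useful computing")

# Output:
# PUE 1.5: 50% extra energy on top of each unit of useful computing
# PUE 1.1: 10% extra energy on top of each unit of useful computing
```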

“AI is the soul, while silicon represents the hardware, and infrastructure serves as the body,” the Ola co-founder said.

He added, “For our well-run AI computing story and business, you need to build the AI models, the infrastructure and the silicon, and you need to integrate them very tightly. And that’s our endeavour.”


Edited by Kanishk Singh