How RagaAI is finetuning AI’s off-key moments

Founded by former NVIDIA and Ola executive Gaurav Agarwal, RagaAI detects and diagnoses issues related to AI models to enhance their reliability and safety in the real world.

Thursday, April 25, 2024, 7 min read

Five years ago, Gaurav Agarwal, a techie, was driving a semi-autonomous car on a stormy night in San Francisco. He ran into some debris on the road, but the vehicle’s AI system failed to detect it, and the car did not slow down or react in time. 

Fortunately, Agarwal spotted the debris and realised that the AI system had malfunctioned. He quickly applied the brakes manually, averting a major accident. 

AI’s failure at a critical juncture prompted Agarwal to take a hard look at how the ecosystem could evolve to prevent system mishaps and failures of this kind—which happen due to changes in the environment, unpredictable debris characteristics, and limitations of sensors and cameras. 

“I realised that if you don’t do a systematic, technology-driven effort to make sure AI does not fail, it could really hold back all the good things it can do,” notes Agarwal. 

With AI innovations becoming potent forces in technology and changing the face of businesses worldwide, it is important to ensure the quality and consistency of AI applications. 

It is with this belief that RagaAI was born in 2022. The startup provides an automated testing platform designed for all types of AI, including generative AI (GenAI), to ensure safe and reliable AI in production.   

The name ‘RagaAI’ is derived from the Hindi word ‘raga’, which means ‘tune’, and stands for the process of fine-tuning AI models to make them trustworthy. 

As Agarwal went about setting up the platform, he tried to understand why AI sometimes fails and why models like GPT-3 give incorrect answers. In the process, he realised that making AI foolproof is highly complex, owing to the static nature of AI models and their limited adaptability to advancements. 

The key is to keep AI systems agile and up-to-date and test them regularly and adequately to prevent serious consequences. Systems that are not thoroughly vetted are less reliable and potential issues may go undetected. 

“To tackle this, we have developed a robust testing and compliance platform for all kinds of AI systems,” says Agarwal, who spent the early part of his career at Texas Instruments. 

He has also worked at Ola and NVIDIA, leading software development in autonomous vehicles.

Fine-tuning AI systems in vehicles

The San Francisco-based startup, which also has operations in Bengaluru, serves mid-sized and large customers in the United States, Europe, and India in the automotive, aerospace, retail, insurance, geospatial imaging, pharmaceuticals, and healthcare (medical imaging/devices) sectors. 

Its team in Bengaluru is focused on ensuring that AI-driven vehicle detection systems identify obstacles, including pedestrians, vehicles and other objects, with enhanced precision, and operate accurately in challenging conditions such as rain, low visibility, and night-time settings. 

The goal is to ensure AI systems in vehicles are proactive and offer improved safety. 

Agarwal explains that, compared to traditional systems, AI systems can quickly analyse complex data from various sensors and provide precise, real-time warnings to drivers, without human intervention. AI algorithms can also keep learning and getting better over time, he adds.  

AI testing 

The startup has created a framework to ensure the safety and dependability of AI. Its foundational model—RagaAI DNA—uses automated processes to identify, diagnose, and rectify issues during testing.

The startup’s AI testing platform offers over 300 tests, which identify issues that can negatively impact the performance of machine learning models. Before an AI system is deployed in the real world, it is put through a simulated environment to see how well it works. 
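
To give a sense of what such an automated test involves, here is a simplified sketch in Python: it adds noise to inputs, mimicking degraded conditions such as sensor interference, and flags the model if accuracy drops beyond a tolerance. The function names and thresholds are illustrative assumptions, not RagaAI's actual interface.

```python
import numpy as np

def accuracy(model, images, labels):
    # `model` is assumed to be any classifier exposing a .predict() method.
    preds = model.predict(images)
    return float(np.mean(preds == labels))

def robustness_test(model, images, labels, noise_std=0.1, max_drop=0.05):
    """Fail the test if noisy inputs cost more than `max_drop` accuracy."""
    clean_acc = accuracy(model, images, labels)
    # Perturb inputs to mimic degraded real-world conditions.
    noisy = np.clip(images + np.random.normal(0, noise_std, images.shape), 0.0, 1.0)
    noisy_acc = accuracy(model, noisy, labels)
    return {
        "clean_accuracy": clean_acc,
        "noisy_accuracy": noisy_acc,
        "passed": (clean_acc - noisy_acc) <= max_drop,
    }
```

A full testing suite would run hundreds of such checks across data drift, bias, and edge cases rather than a single perturbation.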

The platform supports various data and model types, including large language models (LLMs), images, videos, 3D data, and audio. It claims to reduce risks such as security vulnerabilities and biased output by 90% and accelerate AI development by over three times. 

While the platform can be set up as an API, for deeper integration, it’s better to use it as software or a suite of tools specifically designed for simulation and testing, says Agarwal. 

In the AI and LLM testing space, RagaAI competes with players such as Switzerland-based LatticeFlow and US-based Modl.ai. 

Safety measures

The startup also offers ‘RagaAI LLM Hub’—an open-source platform to evaluate and establish guardrails for LLMs and retrieval augmented generation (RAG) applications. 

RAG refers to enhancing a language model’s output by checking an external knowledge base, such as Google, before generating a response.
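
As a rough sketch of the RAG pattern, the snippet below assumes a hypothetical search() retriever and llm() text generator (neither is part of RagaAI's product): relevant passages are fetched first, and the model is asked to answer using only that context.

```python
# Minimal sketch of retrieval augmented generation (RAG).
# `search` and `llm` are hypothetical stand-ins for a retriever
# (vector store, search engine) and a language model client.

def rag_answer(question: str, search, llm, top_k: int = 3) -> str:
    # 1. Retrieve the most relevant passages from an external knowledge base.
    passages = search(question, top_k=top_k)
    context = "\n".join(passages)
    # 2. Ground the model's answer in the retrieved context.
    prompt = (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return llm(prompt)
```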

Guardrails are safety measures and controls that prevent unwanted or harmful outcomes within a system. They also ensure that sensitive data is not leaked during responses. 

The platform has over 30 types of guardrails within its LLM evaluation hub, and more are in the pipeline. These include a feature called anonymisation, which encrypts personal data using natural language processing (NLP), and measures such as secrets protection and vulnerability scanners. 
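
For illustration, the sketch below shows the basic idea behind an anonymisation guardrail: mask personal data before it reaches, or leaves, the model. Production systems typically rely on NLP-based entity recognition rather than simple patterns, and this code is an assumption for demonstration, not RagaAI's implementation.

```python
import re

# Illustrative anonymisation guardrail: mask personal data in text.
# Real guardrails usually use NLP named-entity recognition; regexes
# are used here only for brevity.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{8,}\d"),
}

def anonymise(text: str) -> str:
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text

print(anonymise("Reach me at jane.doe@example.com or +91 98765 43210"))
# -> "Reach me at <EMAIL> or <PHONE>"
```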

RagaAI is also collaborating with regulators in the United States, Europe, and India to ensure AI/ML systems are certified in line with the guidelines and rules they prescribe. 

Tackling hallucinations and bias

RagaAI also helps clients tackle hallucinations (responses containing false or misleading information presented as facts) and bias in AI systems.

Recently, the startup helped an e-commerce client pinpoint and correct hallucinations in its chatbot, reducing response errors.

Let’s say a user seeks assistance because the laptop he has bought doesn’t start. If the chatbot insists that he buy an HDMI cable for a screen problem, it ends up providing an irrelevant solution based on a faulty understanding of the issue. This is the kind of hallucination the e-commerce firm wanted to reduce. 
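
One common way to catch such off-topic answers automatically is to score how relevant the bot's reply is to the user's query. The sketch below uses simple word overlap as a stand-in for the embedding-based or LLM-judge checks a real evaluation suite would use; it is illustrative only, not RagaAI's method.

```python
# Illustrative relevance check: flag replies that share too little
# vocabulary with the user's query. Real evaluators would use embedding
# similarity or an LLM judge instead of raw word overlap.

def relevance_score(query: str, reply: str) -> float:
    q_words = set(query.lower().split())
    r_words = set(reply.lower().split())
    return len(q_words & r_words) / max(len(q_words), 1)

query = "my new laptop does not start"
reply = "Please buy an HDMI cable to fix your screen"
if relevance_score(query, reply) < 0.2:
    print("Possible hallucination: reply does not address the query")
```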

In another instance, the startup helped a client in the US reduce bias in its recruitment process. 

When asked to recommend resumes for doctors, the generative AI tool predominantly selected resumes of male candidates, reflecting bias in the underlying model. 

“We significantly reduced errors by over 90% from previous levels and we are on our path to make it zero,” Agarwal explains.

Challenges and road ahead

Earlier this year, RagaAI secured $4.7 million in a seed funding round led by Pi Ventures. The investment round also saw participation from notable global investors, including Anorak Ventures, TenOneTen Ventures, Arka Ventures, Mana Ventures, and Exfinity Venture Partners.

RagaAI plans to use the funds to ramp up R&D and expand into Southeast Asia and other regions. 

As it embarks on an expansion drive, it has its work cut out. 

Getting customers on board is its biggest challenge. Convincing enterprises that AI needs to be fine-tuned and cannot simply be used as is isn’t easy, as many of them have a limited understanding of AI, its complexities, and the need for testing and refinement.

“Many clients think AI is flawless and are resistant to fine-tuning. They view AI as a plug-and-play solution and are reluctant to accept the need for customisation,” Agarwal points out. 

Clients are also resistant to change or continuous testing after the AI solution is deployed. Besides these challenges, there’s also growing competition from DIY solutions in the market. 

“Clients consider developing in-house solutions (using DIY kits) instead of relying on external vendors,” he adds. 

As it tackles these hurdles, RagaAI can draw comfort from the fact that the AI market—in India and globally—is set for an inevitable explosion. 

In India, the AI market is expected to reach $17 billion by 2027, growing at a CAGR of 25-35% between 2024 and 2027, says a Nasscom-BCG Report. Globally, the AI market, which is currently valued at $208 billion, could reach nearly $2 trillion by 2030 (Statista).


Edited by Swetha Kannan