The GenAI pie: What's cooking in the oven?
India's top tech leaders discuss the potential of GenAI and its various use cases, and spotlight what the future holds.
Generative AI, or GenAI as it is commonly called, is being rapidly deployed by organisations to automate and enhance business functions, from content generation and management to boosting employee productivity.
According to a McKinsey Global Survey on AI conducted in 2024, 65% of respondents reported that their organisations are regularly using the technology. A whopping three-quarters of the 1,363 participants surveyed predicted that GenAI would lead to significant change in the years ahead.
The tremendous potential of AI and its use cases were discussed in the panel discussion ‘The GenAI pie: What's cooking in the oven?’ at YourStory’s India Tech Leaders’ Conclave 2024, held in Bengaluru on June 21, 2024.
The panelists included Prasanth MLNPP, Vice President, Growth and Strategy, Exotel, and Bhanu Jamwal, Head of Presales and Solutions Engineering, APAC, PingCap. The discussion was moderated by Rishabh Mansur, Head, Community, YourStory Media.
Making AI adoption simpler
The session began by examining how enterprises are using AI, and the underlying technology and infrastructure different companies provide to make its adoption simpler.
Drawing on his experience working with several digital companies, Prasanth believes they are far more ready than traditional organisations when it comes to AI adoption. “Digital natives want to build the technology in-house, while traditional ones want to buy or partner with others to implement AI innovations. Besides, digital natives have data more readily available for AI.”
Speaking about use cases, Prasanth pointed out how digital natives are using the technology for performance marketing, including decisions such as what the cost per click should be and how much to bid for each keyword, as well as for content marketing. GenAI is also being heavily used for blog automation, and image and video creation.
“The entire decision-making process, which was earlier the domain of seasoned managers, is now being addressed by AI. We have seen that it removes the bias that managers may have had while making those decisions,” Prasanth said.
Adding to this, Jamwal pointed out that AI has been around for a long time but has gained more prominence today. In the 2010s, a wave of new technologies emerged under the banner of big data. At that point, AI models were ad hoc and more traditional.
Today, the story is completely different.
“With digital natives, the difference experienced is their scale. The rapid increase in their business generates more data. When more data comes in, there are equal challenges: how do you run these AI models on this data, and is your underlying infrastructure and data platform supporting it?” Jamwal said.
Chatbots show a similar shift: traditional ones were more information-driven, while today’s AI chatbots are far smarter and driven by natural language processing (NLP).
Focus on data-centric innovations
The future of GenAI rests on the implementation of data-centric innovations. AI needs data for training, but where that data comes from is an equally important consideration, Prasanth pointed out.
Interestingly, AI can annotate a lot of data and feed it to models. Giving an example, Prasanth shared how training a voice bot has become simpler, since AI can go through past recordings to generate training data. “AI is helping feed data into AI again,” he added.
As someone who has worked closely with data companies, Jamwal feels there has been massive innovation. Going back two decades, he highlighted how traditional databases dominated before big data technologies emerged in the early 2010s.
“With so much data, we started witnessing the advent of AI. If we look at a majority of established AI companies today, one thing they have in common is their use of vector databases. These help large language models work with data more efficiently and retrieve relevant information at an early stage,” he said.
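To make the vector-database idea concrete, here is a minimal, hedged sketch of retrieval in an LLM workflow: documents are stored as vectors, and the closest matches to a query are pulled back as context for the model. The embed() function, the sample documents, and the in-memory index are illustrative stand-ins rather than any vendor's API; a production system would use a trained embedding model and a dedicated vector database.

```python
# Sketch of vector-based retrieval for an LLM workflow.
# embed() is a toy, deterministic stand-in for a real embedding model.
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy embedding: hash the text into a fixed pseudo-random unit vector."""
    seed = int(hashlib.md5(text.encode()).hexdigest(), 16) % (2**32)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

# "Index" a handful of hypothetical documents as vectors.
documents = [
    "Refund policy: refunds are processed within 7 days.",
    "Shipping: orders ship within 48 hours of payment.",
    "Support hours: 9 am to 6 pm IST, Monday to Friday.",
]
index = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    q = embed(query)
    scores = index @ q  # dot product equals cosine similarity for unit vectors
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

# The retrieved passages would be prepended to the LLM prompt as context.
print(retrieve("When will my refund arrive?"))
```

With toy embeddings the ranking is not semantically meaningful; the point is only the mechanics of storing vectors and retrieving nearest neighbours before calling the model.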
Highlighting how digital-native businesses operate, Jamwal shared that they have seen some of their customers’ data needs grow from 1 TB to 150-200 TB in a short period.
However, this growth also brings challenges that need to be addressed, including managing transactional workloads alongside analytics.
“Our vision is to simplify things and to provide a solution where you can run both your transactional workloads and real-time analytics, plus run your AI modeling on the same database. It will also help save costs and time,” Jamwal said.
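As a rough illustration of the mixed workload Jamwal describes, the sketch below runs a transactional insert and a real-time aggregate against the same database from application code. SQLite is used purely as a stand-in so the snippet runs anywhere; it is not PingCap's product, and the orders table and figures are invented for the example.

```python
# Illustrative only: one connection serving both a transactional write
# and an analytical query, the kind of mixed workload described above.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# Transactional workload: record individual orders as they arrive.
with conn:
    conn.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("south", 1200.0), ("north", 850.5), ("south", 430.0)],
    )

# Real-time analytics on the same data, without moving it to a separate warehouse.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
):
    print(region, total)
```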
India and its potential on the world stage
Closing the discussion, Jamwal said the scale and pace of development happening in India is commendable, and that it brings equal opportunities for more AI use cases.
Prasanth applauded the sheer number of end users in India who are helping drive this revolution. “The only way to do this efficiently and effectively is by rapidly deploying AI across several use cases. The Indian tech ecosystem is probably on par with, or next only to, the US in terms of tech adoption,” he concluded.