The Operationalisation of GenAI


The operationalisation of GenAI is becoming significant across various industries. Vinoj Radhakrishnan and Smriti Sharma, Principal Consultants, Financial Services at Fractal, shared insights into this transformative journey, shedding light on how GenAI is being integrated into organisational frameworks, particularly in the banking sector, addressing scepticism and challenges along the way.

“GenAI has undergone an expedited evolution in the last two years. Organisations are moving towards reaping the benefits that GenAI can bring to their ecosystem, including in the banking sector. Initial scepticism surrounding confidentiality and privacy has diminished with more available information on these aspects,” said Sharma.

She noted that many organisations are now pledging substantial investments toward GenAI, indicating a shift from conservative avoidance to careful consideration.

Radhakrishnan added, “Organisations are now more open to exploring various use cases within their internal structures, especially those in the internal operations space that are not customer-facing.

“This internal focus allows for exploring GenAI’s potential without the regulatory scrutiny and reputational risk that customer-facing applications might invite. Key areas like conversational BI, knowledge management, and KYC ops are seeing substantial investment and interest.”

Challenges in Operationalisation

Operationalising GenAI involves scaling applications, which introduces complexities. “When we talk about scaling, it’s not about two or three POC use cases; it’s about numerous use cases to be set up at scale with all data pipelines in place,” Sharma explained.

“Ensuring performance, accuracy, and reliability at scale remains a challenge. Organisations are still figuring out the best frameworks to implement these solutions effectively,” she said.

Radhakrishnan emphasised the importance of backend development, data ingestion processes, and user feedback mechanisms.

“Operationalising GenAI at scale requires robust backend-to-frontend API links and contextualised responses. Moreover, adoption rates play a crucial role. If only a fraction of employees use the new system and provide feedback, the initiative can be deemed a failure,” he said.

The Shift in Industry Perspective

The industry has seen a paradigm shift from questioning the need for GenAI to actively showing intent for agile implementations. “However,” Sharma pointed out, “only a small percentage of organisations, especially in banking, have a proper framework to measure the impact of GenAI. Defining KPIs and assessing the success of GenAI implementations remain critical yet challenging tasks.”

The landscape is evolving rapidly. From data storage to LLM updates, continuous improvements are necessary. Traditional models had a certain refresh frequency, but GenAI requires a more dynamic approach due to the ever-changing environment.

Addressing employee adoption, Radhakrishnan stated, “The fear that AI will take away jobs is largely behind us. Most organisations view GenAI as an enabler rather than a replacement. The design and engineering principles we adopt should focus on seamless integration into employees’ workflows.”

Sharma illustrated with an example, “We are encouraged to use tools like Microsoft Copilot, but the adoption depends on how seamlessly these tools integrate into our daily tasks. Employees who find them cumbersome are less likely to use them, regardless of their potential benefits.”

Data Privacy and Security

Data privacy and security are paramount in GenAI implementations, especially in sensitive sectors like banking.

Radhakrishnan explained, “Most GenAI use cases in banks are not customer-facing, minimising the risk of exposing confidential data. However, there are stringent guardrails and updated algorithms for use cases involving sensitive information to ensure data protection.”

Radhakrishnan explained that cloud providers like Microsoft and AWS offer robust security measures. For on-premises implementations, organisations need to establish specific rules to compartmentalise data access.

“Proprietary data also requires special handling, often involving masking or encryption before it leaves the organisation’s environment,” Sharma added.
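The masking step Sharma describes can be sketched in a few lines. This is a minimal illustration only: the patterns and placeholder labels below are assumptions, and a real bank would rely on a dedicated PII-detection service rather than hand-written regexes.

```python
import re

# Illustrative patterns only; production systems would use a vetted
# PII-detection service, not simple regexes.
PATTERNS = {
    "ACCOUNT": re.compile(r"\b\d{10,12}\b"),              # account numbers
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email addresses
}

def mask(text: str) -> str:
    """Replace sensitive tokens with placeholders before the text
    leaves the organisation's environment (e.g. to a hosted LLM)."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(mask("Refund 1234567890 and email jane.doe@bank.com"))
# → Refund [ACCOUNT] and email [EMAIL]
```

Encryption or tokenisation would follow the same pattern: transform the sensitive fields on-premises, and only let the transformed text cross the boundary.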

Best Practices for Performance Monitoring

Maintaining the performance of GenAI solutions involves continuous integration and continuous deployment (CI/CD).

“LLMOps frameworks are being developed to automate these processes,” Radhakrishnan noted. “Ensuring consistent performance and accuracy, especially in handling unstructured data, is crucial. Defining a ‘golden dataset’ for accuracy measurement, though complex, is essential.”
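The “golden dataset” idea can be made concrete with a small sketch: curated question/answer pairs scored against model output. The dataset contents and the stand-in `ask_model` function are hypothetical, and real evaluations would use semantic similarity rather than exact string matching.

```python
# Curated question/answer pairs — illustrative examples, not real data.
GOLDEN = [
    ("What is the KYC refresh cycle for high-risk clients?", "annual"),
    ("Which team owns sanctions screening?", "compliance"),
]

def ask_model(question: str) -> str:
    # Placeholder for the real GenAI call; always answers "annual".
    return "annual"

def accuracy(golden, model) -> float:
    """Fraction of golden questions the model answers correctly."""
    hits = sum(1 for q, expected in golden
               if model(q).strip().lower() == expected)
    return hits / len(golden)

print(accuracy(GOLDEN, ask_model))  # → 0.5 with this placeholder model
```

Wiring a check like this into the deployment pipeline is what turns a golden dataset into a regression gate: a model or prompt update only ships if accuracy stays above an agreed threshold.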

Sharma added that the framework for monitoring and measuring GenAI performance is still developing. Accuracy involves addressing hallucinations and ensuring data quality. Proper data management is fundamental to achieving reliable outputs.

CI/CD plays a critical role in the operationalisation of GenAI solutions. “The CI/CD framework ensures that as underlying algorithms and data evolve, the models and frameworks are continuously improved and deployed,” Radhakrishnan explained. “This is vital for maintaining scalable and efficient applications.”

CI/CD frameworks help monitor performance and address anomalies promptly. As GenAI applications scale, these frameworks become increasingly important for maintaining accuracy and cost-efficiency.

Measuring ROI is Not So Easy

Measuring the ROI of GenAI implementations is complex. “ROI in GenAI is not immediately apparent,” Sharma stated. “It’s a long-term investment, similar to moving data to the cloud. The benefits, such as significant time savings and reduction in fines due to accurate information dissemination, manifest over time.”

Radhakrishnan said, “Assigning a monetary value to saved person-hours or reduced fines can provide a tangible measure of ROI. However, the true value lies in the enhanced efficiency and accuracy that GenAI brings to organisational processes.”
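The person-hours framing lends itself to a back-of-the-envelope calculation. Every figure below is an illustrative assumption, not data from the interview; the point is only the shape of the arithmetic.

```python
# Back-of-the-envelope ROI sketch — all inputs are assumed for illustration.
hours_saved_per_week = 0.5 * 8        # half a day saved per employee per week
employees_adopting = 200              # assumed adopter count
loaded_hourly_cost = 60.0             # assumed fully loaded cost per hour
annual_platform_cost = 1_500_000.0    # assumed build-and-run cost per year

annual_savings = (hours_saved_per_week * 52
                  * employees_adopting * loaded_hourly_cost)
roi = (annual_savings - annual_platform_cost) / annual_platform_cost

print(f"annual savings: ${annual_savings:,.0f}, ROI: {roi:.0%}")
```

Even this toy version shows why adoption rates matter so much: halve `employees_adopting` and the same platform cost turns the ROI negative.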

“We know the small wins—saving half a day here, improving efficiency there—but quantifying these benefits across organisations is challenging. At present, only a small portion of banks have even started the journey on a roadmap for that,” added Sharma.

Sharma explained that investment in GenAI is booming, but there is a paradox. “If you go to any quarterly earnings call, everybody will say we are investing X number of dollars in GenAI. Very good. But on the ground, everything is a POC (proof of concept), and everything seems successful at POC stage. The real challenge comes after that, when a successful POC needs to be deployed at production level. There are very few organisations scaling from POC to production as of now. One of the key reasons for that is uneasiness on the returns from such an exercise – taking us back to the point on ROI.”

“Operational scaling is critical,” Radhakrishnan noted. “Normally, when you do a POC, you have a good sample, and you test the solution’s value. But when it comes to operational scaling, many aspects come into play. It must be faster, more accurate, and cost-effective.”

Deploying and scaling the solution shouldn’t involve enormous investments. The solution must be resilient, with the right infrastructure. When organisations move from POC to scalable solutions, they often face trade-offs in terms of speed, cost, and continuous maintenance.

The Human Element

Human judgement and GenAI must work in harmony. “There must be synergy between the human and what GenAI suggests. For example, in an investment scenario, despite accurate responses from GenAI, the human in the loop might disagree based on their gut feeling or client knowledge,” said Radhakrishnan.

This additional angle is valuable and needs to be incorporated into the GenAI algorithm’s context. A clash between human judgement and algorithmic suggestions can lead to breakdowns, especially in banking, where a single mistake can result in hefty fines.

Data accuracy is obviously crucial, especially for banks that rely heavily on on-premises solutions to secure customer data.

“Data accuracy is paramount, and most banks are still on-premises to secure customer data. This creates resistance to moving to the cloud. However, open-source LLMs can be fine-tuned for on-premises use, although initial investments are higher,” added Sharma.

The trade-off is between accuracy and contextualisation. Fine-tuning open-source models is often better than relying solely on larger, generic models.

Future Trends

Radhakrishnan and Sharma both noted that the future of GenAI in banking is moving towards a multi-LLM setup and small language models. “We are moving towards a multi-LLM setup where no one wants to depend on a single LLM for cost-effectiveness and accuracy,” said Sharma.

Another trend she predicted is the development of small language models specific to domains like banking, which handle nuances and jargon better than generalised models.
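A multi-LLM setup of the kind Sharma describes is, at its core, a routing decision. The sketch below is a deliberately naive illustration; the model names and the keyword-based rule are assumptions, and a production router would classify queries with a model rather than a word list.

```python
# Naive multi-LLM router: domain-specific or cheap models handle routine
# queries, a large general model handles the rest. Names are illustrative.
def route(query: str) -> str:
    banking_terms = {"kyc", "aml", "sanctions", "swift"}
    words = set(query.lower().split())
    if words & banking_terms:
        return "small-banking-model"   # domain-tuned small language model
    if len(words) < 8:
        return "small-general-model"   # cheap model for short queries
    return "large-general-model"       # fall back to the big LLM

print(route("Summarise the KYC checklist"))  # → small-banking-model
```

The cost and accuracy argument for the pattern is that the expensive general model is reserved for the minority of queries that genuinely need it.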

Moreover, increased regulatory scrutiny is on the horizon. “There’s going to be a lot more regulatory scrutiny, if not outright regulation, on GenAI,” predicted Radhakrishnan.

“Additionally, organisations currently implementing GenAI will soon need to start showing returns. There’s no clear KPI to measure GenAI’s impact yet, but this will become crucial,” he added.

“All the cool stuff, especially in AI, will only remain cool if the data is sound. The more GenAI gathers steam, the more data tends to lose attention,” said Sharma, adding that data is the foundation, and without fixing it, no benefits from GenAI can be realised. “Banks, with their fragmented data, need to consolidate and rein in this space to reap any benefits from GenAI,” she concluded.

The post The Operationalisation of GenAI appeared first on AIM.
