How Cloudera empowers OCBC to integrate AI at scale
- During Cloudera’s Evolve in New York, executives discussed how the company helped OCBC Bank to integrate AI at scale across the organization.
- The 91-year-old bank is one of the first in the world to deploy generative AI tools at scale.
- It refuses to use open APIs, so it retains control over inputs and outputs.
This month, following a six-month trial, OCBC Bank introduced OCBC ChatGPT, making it the first bank globally to embrace generative AI at scale. The bank’s head of group data office, Donald MacDonald, said OCBC is deploying the chatbot to 30,000 staff across 19 countries. Built in collaboration with Microsoft Azure, the chatbot will assist employees at the bank’s 420 branches and offices worldwide with tasks such as writing, research, and ideation.
While OCBC ChatGPT is just one part of the bank’s broader generative AI effort, OCBC is said to be using or piloting four other tools. These are broadly categorized as: “Wingman,” which helps its team of coders write code; “Whisper,” which transcribes voice calls and produces summaries for its contact centers; “Buddy,” which extracts information from 150,000 pages of company documents and records meetings for staff; and “Document AI,” which summarizes documents such as financial reports.
The Singapore-headquartered bank wants to use generative AI to personalize customer interactions, propose stock purchases, and detect fraud and suspicious transactions. According to MacDonald, AI already makes more than four million decisions a day for the bank across risk management, customer service, and sales. He expects that number to surge to 10 million by 2025 as generative AI takes over more functions.
But behind all of these capabilities is the platform that makes them possible.
OCBC and Cloudera’s AI collaboration
The challenge for enterprises embracing generative AI is that they must grant third-party AI tools access to their specialized knowledge and exclusive data for a model to produce precise outputs. Without proper precautions, that carries the risk of exposing confidential data.
This emphasizes the importance of optimal hybrid data management for organizations utilizing third-party AI solutions alongside proprietary data. At OCBC, a hybrid cloud platform like Cloudera’s has been helping the bank gain value from AI and ML for years. “Their [OCBC] success got us excited. I don’t think we expected that. Those guys used our platform, tore it apart, pushed it to a limit, integrated with other ecosystems, and created their platform,” Remus Lim, Cloudera’s VP of Asia Pacific and Japan, told Tech Wire Asia on the sidelines of Evolve, New York.
Lim explained that OCBC had been working on AI years before generative AI became the talk of the town. Cloudera’s blog notes that in 2015, OCBC began a multi-phased initiative with Cloudera, focused on giving customers access to its banking services through an easy, convenient user interface that delivered targeted and tailored products and services.
“They started five years back, setting up an AI lab and recruiting the right people. Today, they have about 200 data scientists,” Lim noted. OCBC eventually migrated to the Cloudera Data Platform (CDP) and CDP Machine Learning in 2022 to power several solutions that have increased operational efficiency, enabled new revenue streams, and improved risk management.
“One of the key things OCBC has emphasized is that it doesn’t use an open API, because it can’t control what goes out and what users key in. That means the OCBC GPT is confined within a very secure and controlled on-premises environment. And that sits on our platform, the CDP, and our machine learning,” Lim told TWA.
According to Cloudera, OCBC built a single entry point for all its LLM use cases: a hybrid framework that seamlessly integrates multiple data sources, including inputs from thousands of customers and a private cloud data lake that keeps customer data safe, to deliver real-time insights customized to its company standards.
The bank built prompt microservices that access both LLMs hosted on its on-premises servers and LLMs available in the public cloud: a cost-effective model that let it use public cloud LLMs or host open-source LLMs, depending on the functionality and customization it needed. By deploying and hosting its own code assistant, scaled for 2,000 users, OCBC saved 80% of the cost of using SaaS solutions.
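Neither Cloudera nor OCBC has published the implementation details of these prompt microservices, but the routing idea can be illustrated with a minimal, hypothetical sketch. Everything below, from the endpoint URLs to the `SENSITIVE_USE_CASES` set and the use-case names, is assumed for illustration and is not OCBC’s actual code.

```python
from dataclasses import dataclass

# Hypothetical routing rules: which use cases must stay on-premises.
# These categories are illustrative only, not OCBC's actual classification.
SENSITIVE_USE_CASES = {"document_ai", "contact_centre_transcripts"}

@dataclass
class LLMEndpoint:
    name: str
    base_url: str
    hosted_on_prem: bool

# One on-premises open-source model and one public-cloud model (both hypothetical).
ON_PREM = LLMEndpoint("open-source-llm", "https://llm.internal.example/v1/complete", True)
PUBLIC_CLOUD = LLMEndpoint("cloud-hosted-llm", "https://cloud.example/v1/complete", False)

def choose_endpoint(use_case: str, contains_customer_data: bool) -> LLMEndpoint:
    """Route a prompt to an on-prem or public-cloud LLM.

    Anything touching customer data, or a use case flagged as sensitive,
    stays inside the controlled on-premises environment; everything else
    may use a public cloud model.
    """
    if contains_customer_data or use_case in SENSITIVE_USE_CASES:
        return ON_PREM
    return PUBLIC_CLOUD

def handle_prompt(use_case: str, prompt: str, contains_customer_data: bool) -> dict:
    """Single entry point for all LLM use cases: pick an endpoint and
    return the request a real microservice would send to it."""
    endpoint = choose_endpoint(use_case, contains_customer_data)
    return {
        "endpoint": endpoint.base_url,
        "on_prem": endpoint.hosted_on_prem,
        "payload": {"use_case": use_case, "prompt": prompt},
    }

if __name__ == "__main__":
    # A code-assistant prompt with no customer data may go to the public cloud...
    print(handle_prompt("code_assistant", "Write a unit test for parse_date()", False))
    # ...while anything involving customer data is kept on-premises.
    print(handle_prompt("document_ai", "Summarise this customer's loan file", True))
```

In a real deployment the returned payload would be sent to the chosen endpoint over the bank’s internal network and the model’s response passed back to the caller; the sketch only captures the routing decision that keeps sensitive prompts on the private, on-premises platform.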
The platform integrates with the bank’s ML operations pipelines and fits into its larger ML engineering ecosystem. This cloud-based, ML-powered platform lets OCBC build its applications using the tools and frameworks its data scientists choose. With ML models, OCBC can also send over 100 different personalized nudges on its mobile banking app, notifying customers about financial opportunities such as eligibility for a new credit card or loan, and achieving click-through rates of up to 50%.
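The nudge mechanism can be sketched in a similar spirit. In this illustrative example, the propensity scores are hard-coded stand-ins for what trained ML models would produce, and the nudge names, messages, and threshold are all hypothetical rather than OCBC’s actual catalogue.

```python
# Hypothetical nudge selection: pick the most relevant offer for a customer
# from a catalogue of personalized nudges. In a real system the scores would
# come from trained ML models; here they are hard-coded stand-ins.

NUDGE_CATALOGUE = {
    "new_credit_card": "You are pre-approved for a new credit card.",
    "personal_loan": "You may be eligible for a personal loan at a preferential rate.",
    "savings_goal": "You are close to hitting this month's savings goal.",
}

def select_nudge(propensity_scores: dict[str, float], threshold: float = 0.5) -> str | None:
    """Return the message for the highest-scoring nudge above the threshold,
    or None if no nudge is relevant enough to push to the customer."""
    best_nudge = max(propensity_scores, key=propensity_scores.get)
    if propensity_scores[best_nudge] < threshold:
        return None
    return NUDGE_CATALOGUE.get(best_nudge)

if __name__ == "__main__":
    # Scores a model might produce for one customer (illustrative values only).
    scores = {"new_credit_card": 0.82, "personal_loan": 0.35, "savings_goal": 0.61}
    print(select_nudge(scores))  # prints the credit card nudge
```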
The initiative has led to a more personalized customer experience, higher campaign conversion rates, faster transactions, reduced downtime for data centers, and an additional SGD 100 million (US$75 million) in revenue a year.