How IBM’s Watson X is Scripting the End of Business As We Know it

IBM has entered the generative AI market with a suite of AI tools designed for enterprises. Banking on its strength as an enterprise-focused company, and with many organisations already using its Watson chatbot, the company is looking to revive its former glory with this next-generation technology.

The platform, known as watsonx, enables enterprises to build and customise large language models (LLMs) to suit their operational and business needs. It comes with a suite of tools for tuning LLMs, a data store built on a lakehouse architecture, and an AI governance toolkit. IBM foresees the platform being used for conversing with customers and employees, streamlining business workflows, automating IT processes, enhancing security, and meeting sustainability objectives.

Watson Code Assistant is one example of watsonx foundation models being fused into software products. Currently, the tool is focused on increasing developer productivity for IT automation, but IBM plans to expand it to other domains such as content discovery, code optimisation, and code explanation.

Personalisation is the way

When Watson first arrived, it was supposed to change everything. But over the course of a decade, IBM’s artificial intelligence couldn’t achieve its grand vision to remake industries or generate riches for companies. As it turns out, the problems Watson hoped to solve, from tackling cancer to climate change, were a lot harder than anticipated, The New York Times reported.

This time around, IBM seems to have resisted the urge to make and invest in overarching claims, focusing instead on narrower use cases. “Watson Code Assistant can run on a private vs. public instance, and retraining doesn’t require sharing code with IBM. That’s the paradigm most businesses are waiting for,” said Vineet Vashishta, Founder & CDO at V Squared.

Organisations want to retrain generative AI code assistants on their own repositories, resulting in personalised, accurate code that meets their team’s specific needs. Vashishta explains that this is precisely why most businesses would not sign up for code assistants using GPT technology today, as exposing proprietary code is a non-starter.
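In practice, retraining a code assistant on an internal repository usually means continued fine-tuning of a base code model on the company’s own source files, with the weights kept on company infrastructure. The sketch below illustrates that general idea with the open-source Hugging Face transformers library; the base model, file paths, and hyperparameters are illustrative assumptions, not details of IBM’s product.

```python
# Illustrative sketch: continued fine-tuning of an open code LLM on an internal repo.
# The model name, data path, and hyperparameters are assumptions for demonstration only.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

BASE_MODEL = "Salesforce/codegen-350M-mono"   # hypothetical choice of open code model
REPO_FILES = "internal_repo/**/*.py"          # hypothetical glob over company source files

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for batching during training
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Load the company's source files as plain text and tokenise them.
dataset = load_dataset("text", data_files={"train": REPO_FILES})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="code-assistant-finetuned",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        learning_rate=5e-5,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()                                # weights never leave company infrastructure
trainer.save_model("code-assistant-finetuned")
```

The point of the pattern is less the training loop itself than where it runs: because everything stays inside the organisation’s own environment, no proprietary code has to be shared with an outside vendor.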

In this regard, personalisation will be the driving force behind most generative AI applications. “IBM is the first company to address that, and it won’t be the last,” he said. As the approach reaches scale, Vashishta makes an interesting observation: “Over the next year, the costs of personalising models and running private instances will drop. Generative models will go from customised to the business to customised to individuals. That’s when we’ll see some truly innovative solutions.”

IBM versus the world

Other enterprise-focused companies have made alternative moves as well. For instance, in a recent blog post, Microsoft explained how ChatGPT can work with enterprise data: the Azure OpenAI Service, combined with Azure Cognitive Search, helps organisations index and retrieve data that is private to the organisation and external to ChatGPT, then feed it to the model. Along with ChatGPT, the Azure OpenAI Service already offers products like the code-generating Codex and the image-generating DALL-E 2.
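The pattern Microsoft describes is essentially retrieval augmentation: query the organisation’s own search index first, then ground the chat model’s answer in the retrieved passages. Below is a minimal sketch of that flow, assuming a hypothetical search index named “enterprise-docs” with a “content” field, an Azure OpenAI chat deployment, and credentials supplied via environment variables; it is an illustration of the approach, not Microsoft’s reference implementation.

```python
# Minimal retrieval-augmented chat sketch (illustrative, not Microsoft's reference code).
# Assumes: an Azure Cognitive Search index called "enterprise-docs" with a "content" field,
# and an Azure OpenAI chat deployment; endpoints and keys come from environment variables.
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="enterprise-docs",               # hypothetical index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_KEY"],
    api_version="2024-02-01",
)

def answer(question: str) -> str:
    # 1) Retrieve the most relevant internal passages for the question.
    hits = search_client.search(search_text=question, top=3)
    context = "\n\n".join(doc["content"] for doc in hits)

    # 2) Ask the chat model to answer using only the retrieved context.
    response = openai_client.chat.completions.create(
        model=os.environ["AZURE_OPENAI_DEPLOYMENT"],  # your chat deployment name
        messages=[
            {"role": "system",
             "content": "Answer using only the provided company context."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer("What is our travel reimbursement policy?"))
```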

At the recent GTC event, we also saw the unveiling of the NVIDIA NeMo service, which will help enterprises combine LLMs with their proprietary data. Likewise, Amazon Bedrock is a newly launched AWS service that offers various foundation models so businesses can develop and customise their own generative AI applications.

“Among the Watson X announcements, AI studio and data store won’t move the needle much. They are catching up, if that, with other vendors who have been offering better AI data stores and data lakes for many years,” Andy Thurai, principal analyst at Constellation Research, told InfoWorld.

IBM’s direct competitors, including SAP, Oracle, and Accenture, are all accelerating their efforts to bring generative AI solutions to their clients. As with any hot technology, the cloud market for generative AI is getting cluttered. At the same time, a trend familiar from the chip industry, known as “coopetition”, is emerging, under which competitors collaborate in several areas to offer solutions. In this light, SAP has announced its plan to embed IBM’s Watson AI technology into its applications.

This raises the question: Will IBM Watson be able to keep up with its competition or will it crumble…again?

Here, it is worth remembering IBM’s early investment in quantum computing: if and when the technology delivers an advantage for enterprises, that head start has already placed the company several years ahead of its competitors.

The quantum bet

During Think 2023, IBM CEO Arvind Krishna emphasised that there is much more to come in the technological landscape beyond AI and hybrid cloud, referring to the impact quantum computing will have. When the technology becomes a decisive factor in gaining a competitive edge, IBM will already be ahead of the curve: it has a head start with a real quantum computer of over 400 qubits running on the cloud.
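For a sense of what “running on the cloud” means in practice, developers can already submit circuits to IBM’s hosted quantum hardware through Qiskit. The snippet below is a minimal sketch using the qiskit-ibm-provider package; the account token and backend name are placeholders, not a recommendation of a particular machine.

```python
# Minimal sketch: submitting a two-qubit Bell-state circuit to an IBM Quantum
# backend over the cloud via Qiskit. Token and backend name are placeholders.
from qiskit import QuantumCircuit, transpile
from qiskit_ibm_provider import IBMProvider

# Build a simple Bell-state circuit with measurement.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

provider = IBMProvider(token="YOUR_IBM_QUANTUM_TOKEN")   # placeholder credentials
backend = provider.get_backend("ibm_brisbane")           # placeholder backend name

job = backend.run(transpile(qc, backend), shots=1024)
print(job.result().get_counts())
```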

Read: Quantum Computing Meets ChatGPT

The company has collaborated with research institutes and quantum computing firms to develop new capabilities and launch courses that upskill individuals in the domain, building a talent pipeline that can keep pace with the technology’s progress.

“Cloud and AI are feeding onto each other, and their combination has led us to achieve an enormous advantage. Without cloud computing, the development of AI would have been much slower. However, by integrating these two technologies with quantum computing, we are poised to reach a remarkable inflection point in this decade,” said Krishna.
