TWO Hai No. 1! 🚀

Indian AI startup TWO is quietly making a mark in the fast-emerging artificial intelligence market. In an interview with AIM, founder Pranav Mistry revealed that the company generated $4 million in revenue this quarter and expects $20 million next year.

This development comes as TWO AI’s SUTRA, a series of multilingual online GenAI models, added a new feather to its cap. The company claims SUTRA outperformed GPT-4o, Llama 3.1, and Indic LLMs, including G42’s Nanda, Sarvam’s OpenHathi and AI4Bharat’s Airavata, leading in over 14 Indian languages.

Earlier this year, the company launched ChatSUTRA, a ChatGPT-like chatbot. Mistry shared that the platform currently has over 600,000 unique users.

Unlike other startups, TWO AI targets only big enterprise customers instead of pursuing the consumer market. “Jio is one of our major enterprise customers, and we also work with clients like Shinhan Bank and Samsung SDS in Korea,” Mistry said. He further revealed that the company has started partnering with companies like NVIDIA and Microsoft on the technology side.

“We are targeting India, Korea, Japan, and some parts of Southeast Asia, like Vietnam, specifically the central region. APAC (Asia-Pacific) is one of the key markets that we are always going to focus on,” Mistry added.

Recently, TWO hosted Mukesh Ambani, chairman of Reliance Industries, and Akash Ambani, chairman of Reliance Jio Infocomm, at its US office. Over a cup of tea, they discussed the evolving role of AI in India and beyond.

Without naming names, Mistry revealed that one of India’s largest banks and financial institutions is among the customers the company will onboard. “Our solutions are in high demand, particularly in industries like finance, services, and retail,” he said.

SUTRA’s business model focuses on providing high-touch, customised solutions for a select group of large enterprises. “We don’t need 100 customers,” said Mistry. “We need 10 good customers.” He explained that this approach still lets the company reach billions of end users, as each of these enterprises already serves millions of customers.

“The focus is to become not the OpenAI of the world, not just an application layer company, but an AI solutions company, going after large enterprises and helping them solve problems with AI,” he added. He shared his goal of following a path similar to Palantir’s.

What’s Next?

Mistry revealed that the company’s next project is predictive AI. “Predictive AI is game-changing for these data-dependent industries. From manufacturing to finance, governance, and energy sectors, everyone can really leverage the decision-making capability of forecasting,” he explained.

The model is called Sutra Predict. Mistry pointed out that it is a small model trained on a trillion data points of time-series data. “The model is small because the architecture of this one is much easier than the text-based ones, and it is already showing great results in some particular domains that our customers are already trying.”

He explained that time series predictive models are a specific type of statistical model used to analyse and forecast data points that are collected over time. They are built to identify patterns, trends, and seasonal variations within the data to make predictions about future values.
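To make the idea concrete, the snippet below is a minimal, generic sketch of the “trend plus seasonality” intuition behind time-series forecasting. It is not Sutra Predict’s architecture; the function name and the synthetic data are purely illustrative, using plain NumPy.

```python
import numpy as np

def trend_plus_seasonality_forecast(series: np.ndarray, season_length: int, horizon: int) -> np.ndarray:
    """Toy forecaster: fit a linear trend, average the seasonal pattern, extrapolate both."""
    t = np.arange(len(series))
    # 1. Fit a straight-line trend through the observed values.
    slope, intercept = np.polyfit(t, series, deg=1)
    trend = slope * t + intercept

    # 2. Average the detrended values at each position of the seasonal cycle.
    residual = series - trend
    seasonal = np.array([residual[i::season_length].mean() for i in range(season_length)])

    # 3. Extend the trend and repeat the seasonal pattern over the forecast horizon.
    future_t = np.arange(len(series), len(series) + horizon)
    return (slope * future_t + intercept) + seasonal[future_t % season_length]

# Example: two years of monthly data with an upward trend and yearly seasonality.
months = np.arange(24)
history = 100 + 2.0 * months + 10 * np.sin(2 * np.pi * months / 12)
print(trend_plus_seasonality_forecast(history, season_length=12, horizon=6))
```

Real predictive models add far more machinery, such as multivariate inputs and learned representations, but the core task is the same: learn the regularities in past observations and project them forward.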

Mistry explained that with the advent of transformers, models can now process and integrate any kind of data, as seen in predictive models like Google TimesFM and Amazon Chronos. Highlighting a real-world application, he shared that an EV battery diagnostic company in India is using Sutra Predict to identify fire risks by monitoring temperature and voltage fluctuations.

Tackling the GPU Challenge

“In India, no one had access to the level of GPU clusters we needed,” Mistry said. The SUTRA team overcame this limitation by porting their models to run on CPU clusters. Despite the challenges, he said the team was able to scale and serve up to 100 billion customers.

Moreover, Mistry shared that they were the first ones to catch on to the trend of 1-bit LLMs.

Notably, Microsoft recently introduced BitNet.cpp, an inference framework for 1-bit LLMs, enabling fast and efficient inference for models like BitNet b1.58.

Mistry said they have successfully adapted the SUTRA model to work with 1-bit weights, allowing it to run as a lightweight model on CPUs.
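For context, BitNet b1.58-style quantization maps each weight to the ternary set {-1, 0, +1} with a single scale factor, which is what makes CPU inference attractive: matrix multiplications reduce to additions and subtractions plus one rescale. The sketch below illustrates that generic absmean scheme; it is not TWO AI’s actual implementation, and all names are hypothetical.

```python
import numpy as np

def quantize_ternary(weights: np.ndarray):
    """Quantize a weight matrix to {-1, 0, +1} using the absmean scheme (generic illustration)."""
    scale = np.abs(weights).mean() + 1e-8          # per-tensor scale
    q = np.clip(np.round(weights / scale), -1, 1)  # round, then clip to the ternary set
    return q.astype(np.int8), scale

def ternary_matmul(x: np.ndarray, q: np.ndarray, scale: float) -> np.ndarray:
    """Matrix multiply with ternary weights: integer adds/subtracts, then one rescale."""
    return (x @ q.astype(np.float32)) * scale

# Toy usage: compress an 8x4 weight matrix and compare against the full-precision result.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4)).astype(np.float32)
q, s = quantize_ternary(w)
x = rng.normal(size=(2, 8)).astype(np.float32)
print(np.abs(ternary_matmul(x, q, s) - x @ w).max())  # approximation error
```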

Moreover, in partnership with NVIDIA, the company launched SUTRA-OP, offering systems like the NVIDIA DGX (Deep GPU Xceleration) box equipped with powerful GPUs for demanding AI tasks.

For customers requiring lighter and more cost-effective solutions, SUTRA also provides its own hardware options, including SUTRA OP2, OP4, and OP8, which are available for lease. “Customers are not purchasing this, they are leasing it from us. It’s a monthly lease for both the OP and the SUTRA solutions,” said Mistry.

The company recently launched a voice-to-voice AI model called Sutra HiFi. Using a dual diffusion transformer architecture, the model effectively separates distinct voice tones from language-specific accents, promising better voice-interaction quality.

“Sutra HiFi brings the ability to interpret conversations seamlessly in languages that we care about. Currently, it supports 12 languages that we have tested properly,” Mistry said. He suggested that Sutra HiFi can easily empower applications in India or any multilingual market while keeping the cost low and accuracy high.

Discussing Infosys co-founder Nandan Nilekani’s point of view that India should be the use case capital of AI, Mistry said that he has a slightly different perspective. “India must focus on building the fundamental AI capabilities because we don’t want to become dependent on someone else in the future, as data is one of the gold mines in AI,” he concluded.
