Why Everyone’s Going Gaga over Together AI

Earlier this week, San Francisco-based Together AI, an organisation dedicated to democratising AI through an open-source cloud platform, secured a $102.5 million Series A investment from NVIDIA, Kleiner Perkins, and Emergence Capital.

“Open source is the future of AI,” said founder and CEO Vipul Ved Prakash, adding that “it will be a major thrust of how most organisations implement generative AI.”

The latest funding comes after the company raised a $32.5 million seed round from Lux Capital and other investors, which has propelled its valuation to $200 million since January. The company aims to empower developers globally, offering them the tools to build and integrate AI models seamlessly into their applications.

“Together AI is well positioned to be the platform of choice as enterprises look to control their proprietary IP while pushing their generative AI investments from prototype into production,” said Joseph Floyd, GP of Emergence Capital.

What Works for Them

The market’s confidence in Together AI’s trajectory stems from a customer base of around 20 clients that is expected to generate approximately $10 million in revenue, underscoring the substantial traction the company has already gained.

Its flagship product, Forge, is the main driving force behind Together AI’s revenue surge, contributing 90% of the company’s total revenue. Launched in June, Forge offers startups computing resources and model-training software in a single package, promising access to servers housing NVIDIA’s A100 and H100 chips at roughly 20% of the cost of renting them from major cloud service providers such as Amazon Web Services. This cost-effectiveness positions the platform as a formidable competitor, potentially surpassing even NVIDIA’s own GPU cloud service.

The robust computing infrastructure is central to their prowess, scaling up to a staggering 20 exaflops across multiple data centres in the US and EU. Their cloud infrastructure, boasting NVIDIA GPUs and networking, in partnership with AI cloud leaders like Crusoe Cloud and Vultr, is meticulously tailored for high-performance AI applications. This bespoke infrastructure grants the company a competitive edge, offering superior economics for pre-training and inference workloads.

Additionally, startups and enterprises increasingly seek a generative AI strategy that avoids reliance on a single vendor, and a key pillar of Together AI’s success lies in its diverse repository of open-source models. It hosts an array of models such as Llama 2, Stable Diffusion, and RedPajama, its own curated set of open-source models and datasets. RedPajama, in particular, has already garnered substantial traction on the Hugging Face model hub, with its 2.8-billion-parameter version downloaded nearly 20,000 times last month.
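For readers curious about trying these models, here is a minimal sketch of pulling a RedPajama checkpoint from Hugging Face with the transformers library; the repository id and generation settings below are assumptions, so verify them against the model card before use.

# Sketch: load an open RedPajama checkpoint from Hugging Face and generate text.
# The repository id below is an assumption; check the Hugging Face model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "togethercomputer/RedPajama-INCITE-Base-3B-v1"  # assumed hub id for the ~2.8B model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Open-source language models are"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation from the base model.
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))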

By providing customers with software and computing resources to run models, Together AI stands shoulder to shoulder with high-profile startups such as CoreWeave, Lambda Labs, and Foundry Technologies. This strategic positioning has garnered substantial investor attention, especially amid the ongoing GPU shortage, during which such startups have collectively raised billions of dollars to meet demand.

Fueling Advancements Through Research

Crucially, Together AI’s industry-leading performance and reliability rest on its commitment to core research. Its groundbreaking endeavours, including the release of RedPajama-V2, the largest open dataset for training large language models (LLMs) at 30 trillion tokens, highlight its dedication to fostering advancements within the AI ecosystem.

Tri Dao, Chief Scientist at Together AI, and collaborators unveiled FlashAttention-2, a pivotal innovation utilised by leading players such as OpenAI, Anthropic, Meta, and Mistral in developing top-tier LLMs. Moreover, the company’s strides in inference techniques, embodied in Medusa, a framework for accelerating LLM generation, and Flash-Decoding, a method for long-context inference, have culminated in the fastest inference stack for transformer models, available through the Together Inference API.
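As a rough illustration of how a hosted inference endpoint like this is typically consumed, here is a minimal sketch of an HTTP call; the endpoint URL, payload fields, and model identifier are assumptions rather than Together AI’s documented interface, so the official API reference should be consulted for actual usage.

# Hypothetical sketch of querying a hosted-model inference endpoint over HTTP.
# The URL, request fields, and model name are assumptions, not Together AI's
# documented API; consult the official docs for the real interface.
import os
import requests

API_URL = "https://api.together.xyz/inference"  # assumed endpoint
headers = {"Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}"}
payload = {
    "model": "togethercomputer/RedPajama-INCITE-7B-Chat",  # assumed model name
    "prompt": "Summarise why open-source LLMs matter in one sentence.",
    "max_tokens": 64,
    "temperature": 0.7,
}

response = requests.post(API_URL, headers=headers, json=payload, timeout=60)
response.raise_for_status()
print(response.json())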

Another allure is Together AI’s distinguished lineup of founders, which further fortifies investor confidence. The team comprises Percy Liang, a Stanford University professor who heads Stanford’s Center for Research on Foundation Models; Chris Ré, a co-founder of SambaNova Systems and Snorkel AI; and former Apple executive Vipul Ved Prakash, who serves as CEO, an ensemble that embodies expertise and vision in the AI realm.

The company’s mission to champion open and decentralised systems echoes the path of MosaicML, which Databricks acquired for $1.3 billion, signalling a potential valuation trajectory for Together AI that, if mirrored, could see it reach a soaring $650 million valuation.
