AI is Just a Hollow Wrapper Around Humanity

At every level—Application > Framework > Infrastructure > Hardware—layers wrap around each other. OpenAI, Google, and Anthropic, for example, build their models on frameworks like PyTorch or TensorFlow, supported by hardware giants like NVIDIA, AMD and Intel, which in turn depend on manufacturers like TSMC.

This intricate web of dependencies is, at its core, just technology wrapped around technology, ultimately revealing that even the most sophisticated AI is but an elaborate wrapper around the human experience.

This notion feels both unsettling and undeniably true.

Recently, Plotch.ai CEO Manoj Gupta shared this perspective on LinkedIn, emphasising that as we layer technology over existing structures, each layer ultimately wraps around the previous one, adding function but concealing the foundational core.

“ONDC is a wrapper around apps. Agentic Mesh can be a wrapper around ONDC,” Gupta explained, describing a shift in commerce from rigid API calls to fluid, conversational commands.

Wrappers, Everywhere

“Wrappers are at all levels; it’s just that they have given you so much value that you do not care,” said Perplexity CEO Aravind Srinivas, commenting on the nature of wrappers. The term ‘wrappers’ gained currency in 2023 amid the boom of generative AI startups built on existing LLMs, which caused an uproar, especially in the VC ecosystem.

In a recent podcast with AIM, Vishnu Ramesh, the CEO of subtl.ai, recalled a memorable interaction along similar lines with a venture capitalist, who suggested he shut down his four-year-old company and start afresh, fearing that its age might hinder seed funding.

“There’s so much history in what we’ve built. I openly told him, ‘I am not going to do that’,” Ramesh shared, reflecting a founder’s resilience and the challenges posed by VC expectations.

Arjun Rao, founding partner at Speciale Invest, highlighted a common VC perspective, where funding decisions tend to lean in favour of less risky and established ideas. “We want innovation at the heart of the product… but we also think in terms of risk,” he said, emphasising the cautious approach of VCs when dealing with high-cost, high-risk foundational AI projects.

Meanwhile, Abhishek Upperwal from Soket Labs noted that conveying deep technical concepts, like efficiency gained through custom CUDA kernels, is challenging when VCs lack the technical depth. “We hit Level 1 in our explanations, but if they show interest, we dive deeper,” he added, underscoring the complexity of discussing technical advancements with investors.

However, as foundation models improve, startups building on them must adapt to these shifts.

“There is one strategy, which is to assume the model is not going to get better, and then you build all these little things on it. And there is another strategy, which is that OpenAI’s models are going to get better in the future. It would seem to me that 95% of the world has been built in the former category,” said OpenAI CEO Sam Altman in an interview, reflecting on the debate over thin wrappers.

Interestingly, Perplexity was in talks to raise millions this week, and many debated whether such companies should even be valued at such a whopping level. With OpenAI integrating search features into ChatGPT, some might argue that the business model of Perplexity, built on both Anthropic’s and OpenAI’s models, is now in question.

Marc Andreessen wrote an essay arguing that humans will never run out of problems, and he believes wrappers are useful in this context, helping solve niche and specialised problems.

Case of India

At the end of the day, wrapper companies are likely to take centre stage over new foundational models, because building LLMs from scratch is an expensive, time-consuming, and capital-intensive undertaking.

Some believe it is not even necessary.

In a country like India, the most populous nation in the world, the approach to building AI differs significantly from that of the West. “If you look at how much an AI solution could mean to a user, it’s much higher in India with a much larger volume,” said Tanuj Bhojwani, head at People+ai, while discussing how India could benefit more from building on existing foundational models rather than developing indigenous models from scratch.

India’s CTO Nandan Nilekani also supports this line of thinking, while others like Bhavish Agarwal and Ajay Chaudhary are optimistic about building the next ‘NVIDIA’ in India and moving beyond application-layer use cases.

There is no stopping Indian founders, many of whom build on existing LLMs, creating value at the application layer. Healthify is an interesting case in point: it was featured at OpenAI’s recent DevDay, demonstrating real-time AI conversations in Hindi using OpenAI APIs.

Another example is Ola’s AI division, which uses Llama models to create AI solutions specifically for the Indian market, adapting them to local languages and cultural needs. AIM has extensively covered how Indian AI companies built on Llama’s open-source models have delivered great value.

The post AI is Just a Hollow Wrapper Around Humanity appeared first on Analytics India Magazine.
