Losing Touch with the Metaverse, Meta Turns to Robotics

Earlier this year, when Figure CEO Brett Adcock called 2024 the year of embodied AI, few could have predicted the extraordinary advancements in robotics that have pushed the boundaries of what once seemed implausible.

Last week, an unexpected contender took robotics to a new level. Meta’s Fundamental AI Research (FAIR) team released three new research artefacts that advance touch perception, robot dexterity and human-robot interaction: Meta Sparsh, Meta Digit 360, and Meta Digit Plexus.

Meta Sparsh, named after the Sanskrit word for ‘touch’, is the first general-purpose encoder for vision-based tactile sensing. The technology aims to give robots a sense of touch, addressing a crucial modality for interacting with the world.

Sparsh works across various types of vision-based tactile sensors and tasks. It uses self-supervised learning, eliminating the need for labelled data, and consists of a family of models pre-trained on an extensive dataset of over 460,000 tactile images.
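To make the idea concrete, here is a minimal sketch of how a general-purpose tactile encoder along these lines could be used: a camera frame from a vision-based tactile sensor is embedded into a feature vector on which downstream tasks (say, force estimation or slip detection) can train small heads. The architecture, dimensions and names are illustrative assumptions, not Meta’s released Sparsh models or API.

```python
# A minimal, hypothetical sketch of a vision-based tactile encoder.
# The backbone, dimensions and names are illustrative assumptions,
# not Meta's released Sparsh models or API.
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Stand-in for a pre-trained, general-purpose tactile encoder."""

    def __init__(self, embed_dim: int = 768):
        super().__init__()
        # In practice this would be a ViT-style backbone pre-trained with
        # self-supervised learning on unlabelled tactile images.
        self.patchify = nn.Conv2d(3, 64, kernel_size=16, stride=16)
        self.proj = nn.Linear(64, embed_dim)

    def forward(self, tactile_image: torch.Tensor) -> torch.Tensor:
        patches = self.patchify(tactile_image)        # (B, 64, 14, 14)
        tokens = patches.flatten(2).transpose(1, 2)   # (B, 196, 64)
        return self.proj(tokens).mean(dim=1)          # pooled (B, 768)

encoder = TactileEncoder()
frame = torch.rand(1, 3, 224, 224)   # one frame from a tactile sensor camera
embedding = encoder(frame)           # feature vector for a downstream head
print(embedding.shape)               # torch.Size([1, 768])
```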

Meta has also released Meta Digit 360, a tactile fingertip featuring human-level multimodal sensing capabilities. Next up is Meta Digit Plexus, a platform paired with Digit 360 that integrates various tactile sensors into a single robotic arm.

Meta FAIR said in its blog that these new artefacts “advance robotics and support Meta’s goal of reaching advanced machine intelligence (AMI)”. AMI, also referred to as autonomous machine intelligence, is a concept proposed by Meta’s chief AI scientist Yann LeCun. It envisions machines that can assist people in their daily lives.

LeCun has proposed a future where systems can understand cause and effect and model the physical world.

Interestingly, Meta’s advancements in robotics are intended to help the whole ecosystem of builders develop machines that understand the world. However, robotics is not new to the company.

Meta’s Robotics Dream Takes Shape

Meta’s robotics developments have largely revolved around the metaverse and AI-powered AR/VR headsets. Two years ago, at the ‘Meta AI: Inside the Lab’ event, the company highlighted AI and robotics developments central to creating the metaverse, largely to deliver immersive virtual experiences. Projects such as Builder Bot and the Universal Speech Translator aim to enrich the metaverse experience.

Notably, Caitlin Kalinowski, the former head of Meta’s hardware division who led the development of the Orion augmented reality glasses, recently joined OpenAI to lead robotics and consumer hardware.

With the recent announcements on tactile sensing innovations, Meta seems to be taking robotics to the next level. Furthermore, by open-sourcing these new models, Meta continues its run of enabling individuals and companies to build in the open-source community. In the process, it is also attempting to take on NVIDIA.

Open Source for Robotics

The graphics processing unit (GPU) giant has also been making significant strides in robotics. NVIDIA Omniverse and digital twins have been powering several domains, including automotive, semiconductors and healthcare. NVIDIA’s Project GR00T, released earlier this year, is a foundation model that aids the development of humanoid robots.

Just last week, the company released two robotics updates. NVIDIA, along with researchers from the University of California, Berkeley, Carnegie Mellon University, and other universities, released HOVER (Humanoid Versatile Controller), a 1.5-million-parameter neural network that controls the body of a humanoid robot. The model is said to improve efficiency and flexibility for humanoid applications.

To further accelerate robotics development, NVIDIA also released DexMimicGen, a large-scale synthetic data generator that allows humanoids to learn complex skills from very few human demonstrations. This effectively reduces the time required to train robots, given that real-world data collection is one of the biggest hurdles in humanoid development.
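As a rough illustration of why synthetic data generation helps, the sketch below expands a handful of recorded trajectories into thousands of lightly perturbed variants for policy training. It is a conceptual toy under stated assumptions, not NVIDIA’s DexMimicGen pipeline; all names and numbers here are made up.

```python
# Conceptual sketch of demonstration augmentation: expanding a few recorded
# trajectories into a large synthetic dataset via small perturbations.
# This illustrates the general idea only, not NVIDIA's DexMimicGen pipeline.
import numpy as np

rng = np.random.default_rng(0)

def augment(demo: np.ndarray, n_variants: int, noise: float = 0.01) -> list:
    """Generate perturbed copies of one demonstration (T x action_dim)."""
    return [demo + rng.normal(0.0, noise, size=demo.shape) for _ in range(n_variants)]

# Five human demonstrations, each 100 steps of a 7-DoF arm action.
human_demos = [rng.random((100, 7)) for _ in range(5)]

# Expand to 5,000 synthetic trajectories for policy training.
synthetic = [aug for demo in human_demos for aug in augment(demo, 1000)]
print(len(synthetic))  # 5000
```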

“At NVIDIA, we believe the majority of high-quality tokens for robot foundation models will come from simulation,” said Jim Fan, senior research manager and lead of Embodied AI at NVIDIA.

With so many advancements, it is evident that companies are increasingly expressing interest in robotics. OpenAI is seemingly positioning itself for the future with the recent appointment of a new leader from Meta. It won’t be too surprising if, tomorrow, Meta or OpenAI releases a robot that embodies all the senses.

Now that robots can hear, see, think, move, and touch, the only sense left is smell. Considering that a company is already building technology to teleport smell, it won’t be a surprise if robots are equipped with this capability in the future!
