Google’s PaLM-SayCan, a Way for Robots to Understand Human Interactions


Unlike typical robots, which struggle to take multi-part instructions, Google’s bot performs tasks involving several steps

How can a robot deliver a human-like experience when it is oblivious to the subtle language people use in different contexts? Google seems to have cracked a major challenge for service robots, i.e., understanding context rather than just formal language semantics. PaLM, Google’s large language model, will train its domestic robots in the subtleties of human language so that they grasp nuance and reason for themselves, even in complex situations.

Instructions as open-ended as “Bring me a snack and something to wash it down with” or “I spilled my drink, can you help?” generally go over the head of task-specific robots. Handling them requires grasping the context, identifying the goal, breaking it into a sequence of steps, and reasoning out the action each step requires. Given the command “I just worked out. Can you bring me a drink and a snack to recover?”, PaLM’s interpretation would be along the lines of: “The user has asked for a drink and a snack. I will bring a water bottle and an apple.” Google says asking the robot in everyday language is enough for it to sense what you need it to do. “With PaLM, we’re seeing new capabilities emerge in the language domain such as reasoning via chain of thought prompting. This allows us to see and improve how the model interprets the task,” said Google in a blog post.
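The step-by-step decomposition described above can be sketched as a few-shot prompt to the language model. This is a hypothetical illustration, not Google’s actual prompt or skill vocabulary:

```python
# Hypothetical sketch of a planning prompt in the style described above
# (not Google's actual prompt): the language model is shown an example of
# a free-form request broken into discrete robot skills, then asked to
# continue the pattern for a new instruction.
def build_prompt(instruction: str) -> str:
    examples = (
        "Human: Bring me a snack and something to wash it down with.\n"
        "Robot: 1. find a bag of chips 2. pick up the bag of chips "
        "3. bring it to you 4. find a water bottle 5. pick up the "
        "water bottle 6. bring it to you 7. done\n"
    )
    # The trailing "Robot: 1." cues the model to emit its first step.
    return examples + f"Human: {instruction}\nRobot: 1."

prompt = build_prompt("I spilled my drink, can you help?")
print(prompt.endswith("Robot: 1."))
```

Completing this prompt with a model like PaLM would yield a numbered skill sequence that a robot controller can execute one step at a time.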

According to Google, this is the first time a robot has been equipped with a large-scale language model. Compared to previous software, PaLM-SayCan improves the robot’s performance by 14%. Google developed the PaLM-SayCan model in collaboration with Everyday Robots, drawing on advances in reinforcement learning, imitation learning, and learning from simulation. Where earlier models were confined to executing short skills, this approach augments those low-level skills with a language model.
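The pairing of low-level skills with a language model works, roughly, by combining two scores for each candidate skill: the language model’s estimate that the skill is a useful next step, and an affordance estimate of how feasible the skill is in the current scene. A minimal sketch of that selection rule, with made-up placeholder scores rather than real model outputs:

```python
# Minimal sketch of skill selection in the spirit of PaLM-SayCan:
# multiply the language model's score for each candidate skill by an
# affordance score (how feasible the skill is right now) and pick the
# best. All numbers below are illustrative placeholders.

def select_skill(llm_scores: dict, affordance_scores: dict) -> str:
    """Return the skill maximizing llm_score * affordance_score."""
    combined = {
        skill: llm_scores[skill] * affordance_scores[skill]
        for skill in llm_scores
    }
    return max(combined, key=combined.get)

# Cleaning up a spill: the sponge is both relevant and reachable.
llm_scores = {"pick up sponge": 0.6, "pick up apple": 0.3, "go to counter": 0.1}
affordance = {"pick up sponge": 0.9, "pick up apple": 0.2, "go to counter": 0.8}
print(select_skill(llm_scores, affordance))  # "pick up sponge"
```

The affordance term is what grounds the language model: a skill the model likes but the robot cannot currently perform gets a low combined score and is skipped.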

The robot has a tubular white body with a grasping claw at the end of its arm, and the cameras in place of eyes lend it a human-like appearance, but its functions are mostly robotic, now with context-interpreting capability. To train PaLM (Pathways Language Model), Google used around 6,144 TPU chips and a vast corpus that included multilingual web documents, books, Wikipedia articles, conversations, and GitHub code. PaLM is considered one of the most capable LLMs, able to interpret jokes, complete sentences, and follow its own chain of thought. Now that Google has combined PaLM with robotic motor skills, the output pairs the best of language and robotic abilities. “As we improve the language models, the robotic performance also improves,” said Karol Hausman, a senior research scientist at Google who helped demonstrate the technology.

PaLM-SayCan was tested against a number of real-world robotic tasks to ground it in the real world and show that the approach is capable of abstract language processing. Google, though, is not yet ready to release its robot to the market in competition with companies like Amazon, which sells its home robots at a premium; it seems to have only research and development in mind. The swarm of Google’s robots moving around the floors of its California office is proof enough. “You could imagine a ton of overlap between Google’s overarching mission and what we’re doing in terms of more concrete goals. I think we’re really at the level of providing capabilities and trying to understand what capabilities we can provide. It’s still a quest of ‘what are the things that the robot can do? And can we broaden our imagination about what’s possible?’”, Google Research robotics lead Vincent Vanhoucke told a digital news portal.


The post Google’s Palm-Saycan, a Way for Robots to Understand Human Interactions appeared first on Analytics Insight.

