MKBHD Gets Gaslighted by Rabbit R1

After ripping the Humane AI Pin apart, MKBHD (Marques Brownlee) has now been gaslit by the Rabbit R1. “My favorite new AI feature: gaslighting,” he quipped.

Jesse Lyu, the founder and CEO of Rabbit, quickly clarified: “we are aware of this bug. This is related to the time zone bug and it’ll get fixed via OTA before next Tuesday, battery performance improvements too.”

My favorite new AI feature: gaslighting pic.twitter.com/Yb43gB3EkA

— Marques Brownlee (@MKBHD) April 26, 2024

MKBHD’s recent interaction with the Rabbit R1 says a lot about how users interact with AI systems. It goes beyond a mere product or gadget review, raising the question of whether AI has emotions and can even manipulate users to the extent of gaslighting someone.

“…it’s just an AI that doesn’t actually have feelings but maybe it makes you think it does,” said Alan Cowen, the founder and CEO of Hume AI, in a recent interview on Every.

Enter Empathic AI

“I think understanding people’s emotional reactions is really key to learning how to satisfy people’s preferences,” said Cowen, introducing the world’s first empathic voice AI, EVI.

“If you’re confused it can clarify things and if you’re excited it can kind of build on that excitement and if you’re frustrated it can be conciliatory,” said Cowen.

Much of this also comes down to user experience and how AI systems interact with their users. “In a customer service call we can predict when somebody’s having a good customer service call… with like 99% accuracy sometimes depending on the context versus with language alone it’s like 80%,” added Cowen.

Founded in 2021 by Cowen, a former researcher at U.C. Berkeley and Google, Hume AI is a research lab and technology company on a mission to ensure that AI is built to serve human goals and emotional well-being.

Cowen believes that voice interfaces will soon be the default way we interact with AI. He said that speech is 4x faster than typing, frees up the eyes and hands, and carries more information in its tune, rhythm, and timbre.

“That’s why we built the first AI with emotional intelligence to understand the voice beyond words. Based on your voice, it can better predict when to speak, what to say, and how to say it,” he added.

Recently, the company raised $50 million in Series B funding from EQT Group, Union Square Ventures, Nat Friedman, Daniel Gross, Northwell Holdings, Comcast Ventures, LG Technology Ventures, and Metaplanet.

AIM also tested out EVI, and there is nothing like it.

EVI API is Finally Here!

The company recently unveiled the Empathic Voice Interface (EVI) API, marking the debut of the first emotionally intelligent voice AI API. EVI is now available, offering the ability to receive live audio input and provide both generated audio and transcripts enriched with indicators of vocal expression.

Having handled 100K conversations at a 10-minute average length and 3 million user messages, EVI introduces innovative features, including discerning appropriate times to speak and crafting empathetic language with precisely the right tone.

The team said that EVI can be configured to customer requirements, with the ability to alter its personality, response style, and speech content. The platform also supports Fireworks’ Mixtral 8x7B, alongside OpenAI and Anthropic models.

In addition to this, developers can connect their own text generation server over EVI’s WebSocket to determine all of EVI’s messages in a conversation. They can also use EVI’s voice by sending text to the API to be spoken aloud.
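As a rough illustration of the text-to-speech flow described above, a client could construct a JSON message asking EVI to speak a given line of text. Note that the endpoint URL, message type, and field names below are assumptions made for illustration, not Hume’s documented schema; actually sending the message would additionally require a WebSocket client library and an API key.

```python
import json

# Hypothetical EVI WebSocket endpoint (assumption, not from Hume's docs).
EVI_WS_URL = "wss://api.hume.ai/v0/evi/chat"

def make_spoken_text_message(text: str) -> str:
    """Build a JSON message asking EVI to speak `text` aloud.
    The "assistant_input" type name is illustrative only."""
    return json.dumps({"type": "assistant_input", "text": text})

# A text generation server would produce a reply, wrap it like this,
# and send it over the open WebSocket connection.
msg = make_spoken_text_message("Hello from my own text generation server.")
print(msg)
```

In this sketch, the application keeps full control of what EVI says: the developer’s own model generates the reply, and EVI is used only for the expressive voice output.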

“Our AI’s strength lies in empowering others through its toolset. Our API is key; it enables users to tailor their experiences and integrate basic tools like web search. It’s about enabling customisation and fostering collaboration, with developers building upon our interface and incorporating user personalisations,” said Cowen.

What’s Next?

Many experts believe that AI systems with emotional intelligence are the future, and Hume AI is well positioned to change how users interact with AI systems.

“In the future, you’re going to want to be able to talk to it in crowded places, and you’re also going to want to have it understand your tone of voice in addition to your facial expression so it knows when you’re done speaking and how you’re feeling,” said Cowen, talking about enabling seamless multimodal interaction.

Further, he emphasized the importance of personalisation in AI communication tools to make them more adaptable and personable. This is crucial for applications where AI interacts directly with users, such as customer service, therapy, or educational tools.

“I think customizing the voice is really important, and the personalities, a lot of it you can do with the prompt obviously; you can’t change the underlying accents and voice quality of the voice, so we’re adding more voices too.”

The post MKBHD Gets Gaslighted by Rabbit R1 appeared first on Analytics India Magazine.
