Meta has been grabbing headlines, and not the good kind. In a recent turn of events, the company has done a little reshuffle, relocating its Responsible AI team to join the generative AI crew. The move comes off as misjudged given the series of unfortunate events that have unfolded at Meta over the past few months.
Its social platforms are causing all sorts of headaches – from tagging some Palestinian Instagram users with the label “terrorist” to WhatsApp’s AI whipping up wacky stickers in response to certain prompts, and even Instagram’s algorithms unintentionally helping folks stumble upon child sexual abuse material.
In the grand scheme of Meta’s big layoffs back in May, the company pulled the plug on a fact-checking project that had taken a good six months to put together. Insider accounts suggest it wasn’t the smoothest move.
David Harris, a senior product lead in charge of the Metaverse project, voiced his concerns in June. Wearing his AI researcher hat, he laid it out in The Guardian: “Sadly,” he said, “the civic integrity team I was part of got the boot in 2020, and with all these rounds of layoffs, I’m worried the company’s ability to combat these issues has taken a hit.”
This team’s been through the wringer before, with a reshuffle earlier this year that left the Responsible AI team looking more like a ghost of its former self, according to Business Insider. Reports even hinted that the team, born in 2019, had limited say-so and had to jump through hoops of stakeholder negotiations to get anything done. Not to mention, last September, Meta bid farewell to its Responsible Innovation Team, a group meant to tackle “potential harms to society.”
All About Generative AI
Meta’s decision to disband its Responsible AI team could be a bit of a tricky move. On the one hand, having a separate crew for Responsible AI means they can do their thing independently from the folks creating the tools they need to check.
But here’s the rub – these researchers tend to jump into action a tad too late. If they were in on the game early in the development process, they could catch some of the problems right out of the gate.
Take, for instance, Meta’s brainchild, Galactica – a ChatGPT-like model for scientific research. It hit the scene, but three days later, it was lights out. Why? Well, it turned out the model couldn’t tell the difference between truth and make-believe – a bit of a hiccup for a language model meant to whip up scientific text. People discovered it was cooking up fake papers, throwing in real authors’ names, and even cranking out wiki articles on the interstellar history of bears.
As Meta keeps churning out these AI models, most of the Responsible AI team is making a move to join Meta’s generative AI squad. The company’s been slashing costs left and right, hitting up departments all over, including the Responsible AI team. But Meta’s eyes are still locked on the prize – it’s all about that generative AI game.
Year of Inefficiency
At the start of 2023, Mark Zuckerberg, the head honcho at Meta, said “Our management theme for 2023 is the ‘Year of Efficiency’ and we’re focused on becoming a stronger and more nimble organisation,” as part of the release of Meta’s fourth-quarter earnings report.
But then things got a bit rocky. The company started laying off a bunch of people, there was the famous LLaMA leak, and now the splitting up of the team tasked with building AI responsibly.
For a long time, Meta was like a superhero in Silicon Valley when it came to safety and ethics. But in May, a bunch of people got the boot in a big company shake-up. Former trust and safety employees felt like their jobs were always on the line, and their managers didn’t always get how important their work was for Meta’s success.
Adding to the brouhaha, the recent big switch-up of teams dealing with safety and AI ethics shows how far companies are willing to go to keep Wall Street happy and meet its demands for efficiency.
The post Meta Wants to Build Generative AI But Not Responsibly appeared first on Analytics India Magazine.