Siri to Finally Get its Apple Intelligence Upgrade Next Week

OpenAI and Apple have officially rolled out ChatGPT integration for iPhones, now available for early developer access in the iOS 18.2 beta. This comes after the two heavyweights announced their partnership earlier this year. The slew of Apple Intelligence features, including the long-awaited ChatGPT integration, will eventually be available on the iPhone 16 lineup, iPhone 15 Pro, and iPhone 15 Pro Max.

The launch is scheduled to take place in Los Angeles on October 30, with the full feature set expected to arrive by 2025 through a phased rollout.

Developers already have early access, and demos are all over the internet. The Apple Intelligence features include Genmoji, Image Wand, and the much-awaited ChatGPT integration in Siri. People have also compared Apple’s Visual Intelligence with Google Lens.

Apple’s AI features mostly rely on models that can run on-device, which is why Apple Intelligence has strict hardware requirements: only devices with at least an A17 Pro or M-series chip and 8GB of memory are compatible.

“It’s really wild how much they packed into the whole experience. Any one of these things would be a feature/product/startup on its own,” said a user on X. With iOS 18.2, Apple Intelligence now supports English for Australia, Canada, New Zealand, South Africa, and the UK, offering full features without switching to US English.

Introduction of the AI-Backed M4 Chip

Apple’s M4 chip, built for AI, boosts privacy-focused, on-device processing power, enhancing tasks like image generation, language understanding, and Siri integration. The chip debuted in the iPad Pro and is now coming to the Mac lineup.

Apple’s Approach to AI

According to Bloomberg’s Mark Gurman, an internal study found that Apple is at least two years behind the current incumbents in the AI race. The study also found that OpenAI’s ChatGPT was 25% more accurate than Apple’s Siri and able to answer 30% more questions.

The shift to AI is a big one, so Apple wants to get it right. “Apple’s point of view is let’s try to get each piece right and release it when it is ready. This isn’t a one-and-done kind of situation, especially with Apple Intelligence,” said Craig Federighi, Apple’s software chief, in an interview, describing a decades-long arc for the technology and Apple’s responsibility-first approach.

Even Tim Cook reinforced this by saying that Apple’s goal was to be the best. “We would rather come out with that kind of product and that kind of contribution to people versus running to get something out first. If we can do both, that’s fantastic. But if we can only do one, there’s no doubt around here. If you talk to 100 people, 100 of them will tell you: It’s about being the best,” he said in an interview with The Wall Street Journal.

What’s New with Apple Intelligence?

“The real power of intelligence is the one that understands you,” said Federighi about how Apple aims to bring in personalised features while also addressing privacy concerns.

Interestingly, unlike other players in this space, Apple runs its AI models directly on devices or through its private, end-to-end encrypted cloud. Federighi maintains that building such a system is challenging, which is why not all cloud computing operates this way.

In June, Apple introduced its on-device and server foundation models. Apple Intelligence’s architecture is built on a Transformer-based model. The on-device model, containing ~3 billion parameters, uses quantization techniques to optimise speed and memory, while the larger, server-based model runs on Private Cloud Compute powered by Apple silicon for resource-intensive tasks.
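Apple has not published its exact compression recipe, but the core idea of low-bit weight quantisation is straightforward to sketch. Below is a minimal, illustrative example in Python of symmetric group-wise quantisation; the function names, 4-bit width, and group size are assumptions for demonstration, not Apple’s implementation.

```python
import numpy as np

def quantize_groupwise(weights, bits=4, group_size=32):
    """Symmetric group-wise quantisation: each group of weights shares one scale."""
    qmax = 2 ** (bits - 1) - 1                      # e.g. 7 for signed 4-bit
    groups = weights.reshape(-1, group_size)
    scales = np.abs(groups).max(axis=1, keepdims=True) / qmax
    q = np.clip(np.round(groups / scales), -qmax - 1, qmax).astype(np.int8)
    return q, scales

def dequantize(q, scales):
    """Recover approximate float weights from integers and per-group scales."""
    return (q.astype(np.float32) * scales).reshape(-1)

rng = np.random.default_rng(0)
w = rng.normal(scale=0.02, size=4096).astype(np.float32)
q, s = quantize_groupwise(w)
print("max reconstruction error:", np.abs(w - dequantize(q, s)).max())
```

Storing 4-bit integers plus one scale per 32 weights takes roughly an eighth of the memory of float32 weights, which is the kind of saving that makes running a ~3-billion-parameter model on a phone with 8GB of memory plausible.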

New Siri will be rolled out in phases. Siri, Apple’s intelligent assistant, was launched 13 years ago, and Apple claims it now processes 1.5 billion requests every day. Siri also gets a new ‘glow light’ when active. As per Federighi, while Siri is not fully sentient yet, it is continuing to evolve.

More on the OpenAI and Apple Collaboration

Both companies announced their one-of-a-kind partnership in June. Federighi still believes that Siri is more powerful than OpenAI’s Advanced Voice Mode, but he is hopeful that the two tools will converge in the future.

“The properties of something like OpenAI Advanced Voice Mode and Siri are quite different,” said Federighi, emphasising the different use cases of the two tools. He said that while OpenAI’s Advanced Voice Mode can answer a question on quantum mechanics or write a poem about it, it won’t help you send a text message.

He added that Siri currently performs numerous useful tasks for users every day, efficiently and locally on their devices, but agreed that one day both tools could converge.

Per Apple, AI Cannot Reason Yet

Apple’s researchers recently challenged the reasoning capabilities of LLMs. Their paper argued that models like GPT-4 and o1 perform sophisticated pattern matching rather than genuine logical reasoning. The researchers tested models on newly developed benchmarks, such as GSM-Symbolic, and found that reasoning performance dropped by 30% when irrelevant information was added to a problem, suggesting reliance on surface-level patterns.
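The benchmark’s methodology is template-based: a grade-school maths problem is turned into many variants by swapping names and numbers, optionally with a clause appended that is irrelevant to the answer. Here is a minimal sketch of that idea in Python; the template, names, and ‘no-op’ clause are invented for illustration and are not taken from the paper.

```python
import random

# Template with symbolic slots; the "no-op" clause changes nothing mathematically.
TEMPLATE = ("{name} picks {a} apples on Monday and {b} apples on Tuesday. "
            "How many apples does {name} have in total?")
NOOP_CLAUSE = " Note that {c} of the apples are slightly smaller than average."

def make_variant(seed, with_noop=False):
    rng = random.Random(seed)
    a, b = rng.randint(2, 50), rng.randint(2, 50)
    question = TEMPLATE.format(name=rng.choice(["Liam", "Mia", "Noah"]), a=a, b=b)
    if with_noop:
        question += NOOP_CLAUSE.format(c=rng.randint(1, 5))
    return question, a + b  # question text and ground-truth answer

for seed in range(3):
    question, answer = make_variant(seed, with_noop=True)
    print(question, "->", answer)
```

A model that genuinely reasons should answer every variant equally well; a score that drops when the irrelevant clause is added points to pattern matching on surface features.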
