Apple surprised everyone with its presence at AWS re:Invent 2024. During his keynote, AWS CEO Matt Garman invited Benoit Dupin, Apple’s senior director of machine learning and AI, on stage to speak about how Apple works with Amazon Web Services (AWS) and uses its servers to power its AI and machine learning features.
Dupin said that the partnership with AWS, which spans more than a decade, has been crucial in scaling Apple’s machine learning (ML) and artificial intelligence (AI) capabilities.
Dupin, who oversees machine learning, AI, and search infrastructure at Apple, detailed how the company’s AI-driven features, including Siri, iCloud Music, and Apple TV, rely heavily on AWS’s infrastructure. “AWS has consistently supported our dynamic needs at scale and globally,” Dupin said.
Apple has increasingly leveraged AWS’s custom silicon, including its Graviton and Inferentia chips, to boost efficiency and performance. Dupin revealed that Apple achieved a 40% efficiency gain by migrating from x86 to Graviton instances, and that moving certain search-related workloads to Inferentia2 allowed those features to run twice as efficiently.
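For readers less familiar with what such a migration involves, the sketch below shows the general shape of an x86-to-Graviton move at the EC2 API level: the instance family (and the CPU architecture of the machine image) changes, while the launch call itself stays the same. The instance types and AMI ID are placeholders chosen purely for illustration; Apple’s actual configuration is not public.

```python
# Illustrative only: a generic EC2 launch call showing the kind of change an
# x86-to-Graviton migration involves. The AMI ID and instance types below are
# placeholders, not Apple's setup.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Before: an x86_64 instance family (e.g. m6i) with an x86_64 AMI.
# After: the Graviton (arm64) equivalent (e.g. m7g) with an arm64 AMI.
response = ec2.run_instances(
    ImageId="ami-xxxxxxxxxxxxxxxxx",  # placeholder; must be an arm64 AMI for Graviton
    InstanceType="m7g.4xlarge",       # Graviton family; the x86 counterpart would be m6i.4xlarge
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```

The workload-level changes (recompiling software for arm64, validating performance) are where the real migration effort lies; the infrastructure change itself is largely a matter of swapping instance families.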
This year, Apple launched Apple Intelligence, which integrates AI-driven features across iPhone, iPad, and Mac. “Apple Intelligence is powered by our own large language models, diffusion models, and adapters, on both devices and servers,” Dupin said. Key features include system-wide writing tools, notification summaries, and improvements to Siri, all developed with a focus on user privacy.
To support this innovation, Apple required scalable infrastructure for model training and deployment. Dupin said, “AWS services have been instrumental across virtually all phases of our AI and ML lifecycle,” including fine-tuning models and building adapters for deployment. Apple is also exploring AWS’s Trainium2 chips, with early evaluations suggesting a 50% improvement in pre-training efficiency.
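The “fine-tuning models and building adapters” Dupin refers to is, in general terms, the pattern of training small adapter weights on top of a largely frozen base model so that one base model can serve many features. The sketch below illustrates that pattern with the open-source peft library; it is not Apple’s internal tooling or AWS-specific code, and the base model, rank, and target modules are placeholder choices.

```python
# A generic sketch of adapter-style fine-tuning using the open-source peft
# library. This is NOT Apple's tooling, only an illustration of the
# "train small adapters on top of a frozen base model" pattern.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder base model

# Low-rank adapter configuration: only these small matrices are trained,
# while the base model weights stay frozen.
config = LoraConfig(r=8, lora_alpha=16, target_modules=["c_attn"], lora_dropout=0.05)

model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```

Because the adapters are tiny relative to the base model, many of them can be trained and deployed independently, one per feature, which matches the multi-feature design Dupin describes.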
“AWS expertise, guidance, and services have been critical in supporting our scale and growth,” Dupin said.
Previously, Apple revealed that it uses Google’s Tensor Processing Units (TPUs), rather than the industry-standard NVIDIA GPUs, to train its AI models. That detail was disclosed in a technical paper Apple published earlier in 2024, outlining the company’s approach to developing its AI capabilities.
At re:Invent 2024, AWS also announced the general availability of Trainium2-powered Amazon Elastic Compute Cloud (EC2) Trn2 instances, which it says offer 30-40% better price performance than the previous generation of GPU-based EC2 instances.