Apple is poised to deliver a significant overhaul of Siri with the release of iOS 26.4 in spring 2026. The update, originally promised as part of iOS 18 and then delayed, represents a fundamental rebuild of the voice assistant, leveraging large language models (LLMs) to dramatically improve its capabilities. While not intended to function as a full-fledged chatbot like ChatGPT or Claude, the revamped Siri promises a more intuitive and contextually aware user experience.
From Task-Specific to Reasoning-Based
Currently, Siri operates by breaking down requests into a series of discrete steps. It identifies the user’s intent, extracts relevant information, and then calls on various APIs and applications to fulfill the request. This fragmented approach limits its ability to handle complex tasks or understand nuanced phrasing. iOS 26.4 introduces an LLM core, shifting Siri from a system of keyword and intent matching to one capable of genuine understanding and reasoning. Instead of simply translating voice to text and mapping it onto predefined commands, Siri will aim to grasp the specifics of a user’s request and determine the best course of action.
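To make that contrast concrete, the sketch below is purely illustrative: none of these types or functions reflect Apple’s internal APIs, and they exist only to show the difference between matching a request against fixed intents and handing the whole request to a model that plans the steps itself.

```swift
// Hypothetical sketch only; not Apple's architecture or APIs.

// Legacy-style handling: match the utterance against a fixed set of intents,
// pull out slot values, and dispatch to a hard-coded handler.
enum Intent {
    case setTimer(minutes: Int)
    case unknown
}

func classify(_ utterance: String) -> Intent {
    if utterance.lowercased().contains("timer"),
       let minutes = utterance.split(separator: " ").compactMap({ Int($0) }).first {
        return .setTimer(minutes: minutes)
    }
    return .unknown
}

func handleLegacy(_ utterance: String) -> String {
    switch classify(utterance) {
    case .setTimer(let minutes): return "Timer set for \(minutes) minutes."
    case .unknown:               return "Sorry, I didn't get that."
    }
}

// LLM-style handling: pass the whole request, plus whatever context is
// available, to a model that decides the steps itself.
struct PlannerModel {
    func plan(request: String, context: [String: String]) async -> [String] {
        // A real model would return a structured plan of app actions to run;
        // this stub just returns placeholder steps.
        ["gather context", "choose actions", "execute: \(request)"]
    }
}
```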
Apple Intelligence Features Coming to Life
The improvements are centered around three key areas: personal context, onscreen awareness, and deeper app integration. Siri will gain the ability to understand references to emails, messages, files, and photos, allowing users to ask questions like “Show me the files Eric sent me last week” or “Find the email where Eric mentioned ice skating.” Onscreen awareness will enable Siri to act on the user’s current screen content, for example adding an address from a text message to a contact card or sending a photo the user is currently viewing. Finally, deeper app integration will allow Siri to perform more complex actions across multiple applications, such as moving files between apps or drafting and sending emails.
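For a sense of what deeper app integration builds on, Apple’s existing App Intents framework already lets apps expose actions that Siri and Shortcuts can invoke. The sketch below is a minimal, hypothetical example of such an action; the intent, its parameters, and the MailService type are invented for illustration and are not Apple’s announced schema.

```swift
import AppIntents

// Hypothetical stand-in for an app's own mail layer.
struct MailService {
    static let shared = MailService()
    func draft(to recipient: String, body: String) async throws {
        // A real app would create a draft in the user's mail store.
        print("Draft to \(recipient): \(body)")
    }
}

// A minimal App Intents action a mail app might expose so an assistant
// could draft an email on the user's behalf.
struct DraftEmailIntent: AppIntent {
    static var title: LocalizedStringResource = "Draft Email"

    @Parameter(title: "Recipient")
    var recipient: String

    @Parameter(title: "Message")
    var message: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        try await MailService.shared.draft(to: recipient, body: message)
        return .result(dialog: "Drafted an email to \(recipient).")
    }
}
```

In principle, an LLM-driven Siri could chain actions like this across several apps to carry out a multi-step request, which is what the deeper app integration described above implies.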
A Troubled Path to Improvement
The road to this update hasn’t been smooth. Apple initially aimed to deliver these improvements with iOS 18 but encountered significant technical challenges. An attempt to merge existing Siri systems with LLM-based functionality proved unsuccessful, necessitating a complete architectural overhaul. According to comments made at an internal all-hands meeting in August 2025, the only viable path forward was to build a new architecture centered on a large language model.
Google’s Gemini Powers the New Siri
To accelerate development and overcome limitations with its in-house AI models, Apple has partnered with Google, integrating a custom version of the Gemini AI model into Siri. This collaboration allows Apple to leverage state-of-the-art LLM technology while continuing to develop its own AI capabilities. Apple intends to preserve user privacy by processing some features on-device and routing other requests through “Private Cloud Compute” to anonymize them; users will also be able to disable AI features entirely.
What Siri Won’t Be (Yet)
Despite the significant advancements, the iOS 26.4 version of Siri will not function as a traditional chatbot. It will lack the long-term memory and open-ended conversational capabilities found in models like ChatGPT and Claude, and Apple is keeping a voice-first interface with only limited typing support for this iteration. However, further enhancements, including a chatbot-style interface, are planned for iOS 27.
Internal Restructuring and a Renewed Focus
The delays and challenges with Siri prompted a restructuring of Apple’s AI team. John Giannandrea, Apple’s head of AI, was removed from the Siri leadership team, with Mike Rockwell, previously the head of the Vision Pro project, taking over. This shift reflects a renewed commitment to AI development and a desire to accelerate progress. Tim Cook reportedly reassured employees that AI is a critical priority for the company, emphasizing the investment Apple is willing to make to become a leader in the field.
Launch Timeline and Compatibility
Apple has committed to releasing the updated Siri with iOS 26.4 in spring 2026. Testing is expected to begin in late February or early March, with a public release anticipated around April. While Apple hasn’t explicitly stated which devices will support the new Siri, it will likely be compatible with all devices that support Apple Intelligence features.
The iOS 26.4 update represents a crucial turning point for Siri. After a period of stagnation and a highly publicized delay, Apple is finally delivering on its promise of a more intelligent and capable voice assistant. While the initial release won’t be a complete transformation, it lays the groundwork for further advancements and positions Siri to better compete with leading AI assistants in the market.
