TECH NEWS – The Cupertino giant may be trying to pressure its employees into integrating AI more deeply into their workflows.
Global sourcing teams inside Apple's business development division have reportedly been given a daily token budget worth $300 for using Claude AI over the past few weeks. Approval for extra tokens increasingly depends on how heavily a team uses AI overall: Apple teams that significantly underuse their daily budget are now more likely to see additional requests denied. To put that daily allowance into perspective, Anthropic says Claude Code typically costs around $100 to $200 per developer each month on Sonnet 4.6; at roughly 22 working days a month, a $300 daily allowance works out to about $6,600 per team per month, many times the typical per-developer spend. That strongly suggests Apple is doubling down on AI usage across its internal workflows, with team productivity clearly in mind.
A friend at Apple told me that over the past couple weeks, their team got access to Claude with a $300/day token budget.
This is global sourcing on the business side, not engineering.
I’m also hearing that when directors ask for backfill, senior leadership is asking what the…
— Midnight Capital LLC (@Midnight_Captl) April 13, 2026
Apple's consumer-facing AI push also appears to be nearing launch. The company's revamped Siri chatbot is expected to run on Google's TPUs and cloud infrastructure, even though the service itself would remain Apple-owned, and Apple insists the arrangement would not weaken its strict privacy guarantees. Bloomberg's Mark Gurman previously reported that the updated Siri chatbot would be woven into Apple's software and gain the ability to use personal data, perform in-app actions, search the web, generate content (including images), offer coding help, summarize and analyze information, and upload files.
Apple is also said to be developing a feature that will let the Siri chatbot view already opened windows and on-screen content. It will apparently be able to modify device settings and functions as well, and handle more complex requests that combine multiple commands into a single instruction. Reports say Siri will use a much more advanced version of Google’s Gemini model, internally referred to as Apple Foundation Models 11. Gurman believes that model could be competitive with Gemini 3 and significantly more capable than the system currently powering the overhauled Siri experience.
Siri will also not remain limited to voice commands. Apple is reportedly preparing a dedicated Siri app for iOS 27, one that would serve as a central archive for previous conversations with the AI assistant. That app is expected to include an Extensions feature that connects directly to third-party agents such as OpenAI's ChatGPT and Anthropic's Claude, allowing Siri to tap into their capabilities. Apple is also expected to introduce a dedicated Extensions section inside the App Store, where users could install all supported third-party agents.
While users will still be able to activate Siri through voice commands or the power button, Apple is currently testing a new interface that would live inside the Dynamic Island. The ultimate goal, according to the report, is for Siri to replace Spotlight as Apple's main search layer, creating a unified search experience. The new interface would still surface Siri suggestions spanning apps, upcoming meetings, and AI-recommended settings changes.
Source: WCCFTech, Intuition Labs