What Apple is testing now
Apple is reportedly running an internal chatbot called Veritas as a prototype for next-generation Siri features, according to a Bloomberg report. The app is available only to employees. It resembles conversational systems such as ChatGPT and Google Gemini, and it lets staff try features such as searching personal data on a device and triggering in-app actions.
Key names and facts up front: the report comes from Bloomberg; the project is called Veritas; it is intended to test Siri upgrades that Apple has delayed; and Apple may continue to rely on Google’s Gemini for some AI search capabilities while it refines its own systems.
Why this matters to ordinary users
Siri is Apple’s built-in voice assistant, and many iPhone and Mac users expect it to get smarter over time. If Veritas leads to consumer features, everyday tasks like finding messages, managing calendar events, composing drafts, and controlling apps by voice could become more conversational and context-aware.
At the same time, Apple has been cautious about releasing AI features. Its recent Apple Intelligence update received mixed feedback, and some planned Siri upgrades were pushed back. Veritas shows Apple is actively prototyping, but for now the work is internal and not available to consumers.
What Veritas is and what it does
Veritas is an internal, ChatGPT-style chatbot used by Apple employees to prototype voice assistant upgrades. Employees can:
- Converse with the bot to test natural language responses.
- Revisit previous exchanges to check memory or context handling.
- Test actions that access personal data on a device, like messages and photos, and perform in-app tasks.
The goal is to see how conversational AI could work with a person’s own data and with apps on Apple devices, while maintaining Apple’s privacy rules and product standards.
Why Apple is keeping Veritas internal
Apple appears to be limiting Veritas to staff during early development for several practical reasons.
- Control and safety. Internal testing gives Apple the ability to monitor behavior, correct problems, and stop features that produce errors before any public exposure.
- Privacy and compliance. Allowing a chatbot to access personal data raises complex privacy and security questions. Internal trials help Apple assess safeguards and data handling without exposing user information.
- Iterative improvement. Developers and product teams can try feature variants quickly, gather feedback, and refine language, instructions, and user experience without the pressure of public expectations.
The tradeoffs include slower public feedback and the risk of falling behind competitors who release consumer tests more widely.
Siri’s recent struggles and delays
Siri has faced expectations for years that it would become a more powerful conversational assistant. Apple recently promoted a major step called Apple Intelligence, but user reaction was mixed: some features were delayed or received tepid reviews. Veritas appears to be one path Apple is using to close those gaps before a broader rollout.
That caution is partly why Apple has not yet delivered the kind of chat-driven, generative experiences some rivals already ship. Internal testing can help avoid public missteps, but it can also slow the pace of visible progress.
Apple and Google: partnerships and dependencies
Bloomberg’s reporting indicates Apple might lean on Google’s Gemini for some AI-powered search capabilities. Apple partnering with or licensing services from other companies is not unusual, especially when time to market or specialized capability is a factor.
Using an external model has pros and cons:
- Pros. Faster access to advanced language models, reduced development overhead, and improved baseline performance for search and comprehension tasks.
- Cons. Potential questions about data routing and control, and weaker brand differentiation if core features depend on competitors.
Privacy and data concerns
Allowing an AI assistant to search a user’s messages, photos, and other personal files raises privacy and security issues. Apple has long positioned itself as a privacy champion, so any system that touches personal data will be subject to extra scrutiny.
Key privacy considerations include:
- Where data is processed: locally on the device or on company servers.
- How long interactions and personal prompts are stored.
- Whether external models like Gemini see raw user data or only transformed signals.
- How users can review, correct, and delete AI-driven actions that involve their personal information.
Internal testing can help Apple design features that match its public privacy commitments, but it also means careful engineering choices must be made before a consumer release.
Product strategy and possible release scenarios
Apple has several ways it could take Veritas work to customers, each with different tradeoffs.
- Internal refinement then full consumer launch. Apple could continue internal testing until the product meets its standards, then release a broadly available Siri upgrade across devices.
- Staged rollout. A limited public beta could be offered to developers and opt-in users to gather broader feedback while keeping control over data access and feature sets.
- Hybrid approach with partner models. Apple could integrate external models like Gemini for some features while keeping sensitive processing on device for privacy-critical tasks.
Timelines are uncertain. The Bloomberg report suggests Apple is still in prototype phases, so a wide consumer release could be months away or longer. Apple will weigh reliability, safety, and privacy as it decides next steps.
What this means for everyday users
If Veritas leads to a consumer Siri upgrade, users can expect smarter conversations, better context retention, and more natural ways to perform tasks across apps. That could reduce friction when composing messages, scheduling, or finding information on a device.
At the same time, users should expect clear privacy controls. Apple will need to explain what data the assistant can access, how it is protected, and how to opt out of features that require access to personal content.
Broader industry impact
Apple’s internal approach highlights a split in the AI assistant market. Some companies push public releases and rapid iteration. Others take a slower, more controlled path that prioritizes privacy and product polish.
Apple’s choices will influence developers and device makers that integrate assistant technologies. If Apple opts for tighter privacy controls, partner ecosystems may need to adapt to new APIs and guardrails. If it leans on external models, the boundaries between companies may shift as they trade off speed for control.
Key takeaways
- Veritas is an employee-only chatbot Apple uses to prototype new Siri features, including accessing device data and performing in-app actions.
- Apple is testing internally to control quality and address privacy concerns before any public release.
- Apple may rely on Google’s Gemini for some search-related AI functions while developing its own models and interfaces.
- Any consumer rollout will depend on technical readiness, privacy architecture, and user controls.
FAQ
Will Veritas be available to consumers soon?
Not immediately. Veritas is described as an internal test. Apple may take months to refine features and privacy safeguards before a public release or beta.
Is Apple replacing Siri with Veritas?
No. Veritas is a prototype to help improve Siri. Siri remains the consumer assistant on Apple devices.
Does this mean Apple uses Google technology now?
Bloomberg reports indicate Apple may use Google’s Gemini for certain search capabilities. That would be a selective partnership for specific functions, not a wholesale replacement of Apple technology.
Conclusion
Veritas shows Apple is actively experimenting with more conversational, generative features for Siri. The internal, employee-only nature of the project reflects Apple’s cautious stance on privacy and reliability after the mixed reception of some recent AI moves. For users, improved assistant functionality could make daily tasks easier, but it will come with questions about how personal data is used and protected. Apple’s next steps will affect not only Siri users, but also how the tech industry approaches assistant features and privacy trade-offs.






