Why Apple Keeps Failing at Artificial Intelligence
How OpenAI Uses ChatGPT to Catch 'Leakers'

Welcome to another edition of Horizon AI,
Some people expected Apple to lead the modern AI wave, but the reality has been far different: Apple's AI development has lagged significantly behind its competitors. In today's issue, we look at how Apple's long-standing culture and strategy led to this moment, in which it must rely on one of its biggest rivals for its AI efforts.
Let's jump right in!
Read Time: 4.5 min
Here's what's new today in Horizon AI:
Chart of the week: How people use ChatGPT
How OpenAI Tracks Down Internal Leakers Using ChatGPT
AI Findings/Resources
AI tools to check out
Video of the week
TOGETHER WITH WISPR FLOW
Vibe code with your voice
Wispr Flow lets you dictate prompts, PRDs, bug reproductions, and code review notes directly in Cursor, Warp, or your editor of choice. Speak your instructions and Flow auto-tags file names, preserves variable names and inline identifiers, and formats lists and steps for immediate pasting into GitHub, Jira, or Docs. That means less retyping, fewer copy-and-paste errors, and faster triage. For deeper context and examples, see the Vibe Coding article on wisprflow.ai. Try Wispr Flow for engineers.
Chart of the week
How people use ChatGPT

Writing and practical guidance dominate ChatGPT usage, with editing text, personal communication, tutoring, and how-to advice making up the largest share of conversations.
Information seeking is another major use, with many users turning to ChatGPT for specific facts, explanations, and product-related questions.
AI News
OPENAI
How OpenAI Tracks Down Internal Leakers Using ChatGPT

OpenAI is reportedly deploying an internal version of ChatGPT to investigate potential information leaks, according to The Information.
Details:
When a news article about internal operations surfaces, OpenAI's security team feeds the text into a customized ChatGPT system that has access to internal files as well as employee Slack and email communications.
The system then suggests possible sources of the leak by identifying files or communication channels that contain the published information and highlighting which employees had access to them.
It is unclear whether the process has actually caught anyone yet.
OpenAI has fired researchers in the past for allegedly leaking internal information outside the company; this effort would be yet another example of how highly the company prioritizes confidentiality.
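The Information's description of the workflow above amounts to a two-step match: find internal documents that share distinctive text with the published article, then list who had access to them. The sketch below is a minimal, hypothetical illustration of that idea, not OpenAI's actual system; all names, data, and the phrase-matching heuristic are invented for the example.

```python
# Hypothetical sketch of the leak-triage idea: match distinctive phrases from
# a published article against internal documents, then list employees who had
# access to the matching documents.

def distinctive_phrases(text, n=5):
    """Break text into overlapping n-word phrases used as match keys."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def suspect_documents(article, documents):
    """Internal docs sharing at least one n-word phrase with the article."""
    article_phrases = distinctive_phrases(article)
    return [doc_id for doc_id, body in documents.items()
            if article_phrases & distinctive_phrases(body)]

def possible_sources(article, documents, access_log):
    """Employees with access to any document that overlaps the article."""
    docs = suspect_documents(article, documents)
    return sorted({emp for doc_id in docs for emp in access_log.get(doc_id, [])})

# Invented example data
documents = {
    "memo-1": "the new model will launch in early spring with a revised pricing tier",
    "memo-2": "cafeteria hours change next week",
}
access_log = {"memo-1": ["alice", "bob"], "memo-2": ["carol"]}
article = "Sources say the new model will launch in early spring with a revised pricing tier."

print(possible_sources(article, documents, access_log))
```

A real system would of course rank candidates rather than exact-match phrases, and the reported version also searches Slack and email, but the access-intersection step is the core of the approach as described.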
AI Findings/Resources
Reddit user asks AI to redesign their ugly apartment kitchen, then actually does it in real life
ChatGPT's response to someone faking a quicksand emergency
Someone built a flight simulator in under an hour using Claude Opus 4.6 and GPT-5.3
AI turned Breaking Bad into Helium Balloon
A report that uncovers what defined AI in 2025
AI Tools to check out
ChatGPT Translate: Translate across 40+ languages with accuracy, tone, and cultural nuance.
Monaco: Automates customer acquisition and revenue growth for startups.
Datastripes: Turn your data into visual stories and podcasts in seconds.
Rowboat: Open-source AI coworker that turns work into a knowledge graph and acts on it.
Agent Skills Marketplace: Browse 200,000+ agent skills compatible with Claude Code, Codex CLI, and ChatGPT.
Video of the week
Apple Has Failed At Artificial Intelligence. Again.
After two years of trying to develop proprietary LLMs to compete with ChatGPT, Apple has entered into a $1 billion-a-year partnership with Google. This collaboration means the next generation of Apple Foundation models will be based on Google's Gemini technology. This move is seen as an admission that Apple couldn't catch up to the "six-year head start" held by competitors like OpenAI and Google.
A Culture of Control vs. Innovation
There are several core "weaknesses" that contributed to this failure:
The Siri Stagnation: Siri has used the same rule-based architecture for 15 years, requiring engineers to program rigid logic paths for every possible question.
Secrecy Over Collaboration: While the rest of the AI world thrives on open-source research and sharing (like Google's "Attention is all you need" paper), Apple built its AI in total secrecy. This led to a massive "talent drain" as top researchers preferred to work where they could publish their findings.
Stingy Infrastructure: Apple reportedly lacked the hardware necessary for top-tier training, owning roughly 100,000 AI chips compared to Googleβs one million+.
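The "rigid logic paths" point above is the crux of the Siri critique: in a rule-based assistant, every supported question needs a hand-written pattern and response, so coverage grows only as fast as engineers can add rules. The toy sketch below illustrates that architecture in miniature; it is not Siri's actual code, and the patterns and responses are invented for the example.

```python
# Toy illustration of a rule-based assistant: each capability is a hand-written
# pattern-response pair, and anything outside the rules falls through.

import re

RULES = [
    (re.compile(r"\bweather\b"), "Here is today's forecast."),
    (re.compile(r"\bset a timer for \d+ minutes?\b"), "Timer set."),
]

def answer(query):
    """Return the first matching rule's response, else a canned fallback."""
    for pattern, response in RULES:
        if pattern.search(query.lower()):
            return response
    # No engineer wrote a rule for this query, so the assistant shrugs.
    return "Sorry, I don't know how to help with that."

print(answer("What's the weather like?"))
print(answer("Explain quantum computing"))
```

An LLM-based assistant inverts this: one model generalizes across phrasings and topics without a per-question rule, which is exactly the gap the video argues Siri never closed.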
Marketing to Shareholders, Not Users
"Apple Intelligence" is described as a vague marketing concept designed more to reassure panicking investors than to define a specific technology. While the announcement successfully boosted Apple's stock price by 30% by the end of 2024, the actual product delivery has been rocky. Early features like email summaries and "Genmoji" faced criticism for poor performance and heavy battery drain.
The Risks of Dependency
History may be repeating itself. Apple previously relied on Google for Maps data until Google began withholding features for Android, forcing Apple to launch its own (initially disastrous) Maps service. By relying on Gemini for the "new and improved" Siri arriving in 2026/2027, Apple is once again putting its core user experience in the hands of its biggest competitor.
Thatβs a wrap!
We'd love to hear your thoughts on today's email! Your feedback helps us improve our content.
Not subscribed yet? Sign up here and send it to a colleague or friend!
See you in our next edition!
Gina

