
Apple’s much-anticipated AI-powered Siri upgrade has been delayed, with security concerns reportedly playing a major role in the setback. Meanwhile, a recent blunder involving Apple’s AI transcription tool has raised further questions about the company’s ability to deliver reliable artificial intelligence features. Apple announced that its revamped Siri, powered by Apple Intelligence, is “taking longer than we thought” to develop.
While the company has not provided specifics, experts suggest security vulnerabilities, particularly prompt injection attacks, are a key issue. These attacks allow malicious actors to manipulate AI into executing unintended actions, posing a serious risk when Siri gains greater access to apps and user data. As Apple grapples with these security concerns, another AI-related mishap has highlighted the risks of unreliable machine learning.
A 66-year-old Scottish woman was shocked when Apple’s Visual Voicemail transcription system distorted a routine message from a car dealership, mistakenly suggesting she had “been able to have sex” and calling her a “piece of s***.” Experts believe that AI struggled with the speaker’s Scottish accent, raising concerns about the system’s training data and safeguards against offensive errors. The incident follows a pattern of AI-related missteps for Apple, including recent issues with voice-to-text transcriptions and AI-generated news summaries producing misinformation.
While Apple Intelligence promises a more advanced Siri, the company must first overcome significant hurdles in both security and accuracy before rolling out its AI features to the public..