Google has been quietly installing an app called SafetyCore on Android devices. SafetyCore scans incoming and outgoing pictures to detect potentially explicit content, and the feature, set to roll out in 2025, uses on-device artificial intelligence (AI) to identify images that may contain nudity and warn users before they send or forward them.
SafetyCore is a system service that launched in October 2024 as part of the Android Security Bulletin. While Google says the app doesn't collect or share user data, privacy advocates have raised concerns about its potential for abuse and surveillance. Users can uninstall or disable SafetyCore through their device's settings, though some worry the service could still be used to target specific individuals.
The Sensitive Content Warnings feature in Google Messages relies on SafetyCore to detect potentially explicit content before a message is sent. However, Google's exact data-processing and storage practices remain unclear. The company has committed to following the Play Families policy, which restricts data sharing with third-party companies, but some experts warn that AI-powered scanning features like this may still pose a risk to user privacy.
As users debate the merits of SafetyCore, cybersecurity experts emphasize the importance of looking beyond the headlines and protecting device security directly. To safeguard against mobile threats, download Malwarebytes for iOS and Android today.
Source: https://www.malwarebytes.com/blog/news/2025/02/android-happy-to-check-your-nudes-before-you-forward-them