New NowSecure Research Targets Mobile App Privacy Risks: What You Don’t See Is Hurting You

Mobile applications power how we live and work, but behind the convenience lies a web of hidden data collection that puts users and enterprises at risk. 

Increased user profiling and real-time data harvesting (such as the practices implicated in the Gravy Analytics incident) have raised the stakes for protecting mobile app users from privacy leaks. Safeguarding privacy is critical not only to compliance, but also to protecting your customers, employees and brand.

Our new research, detailed below, builds on the continuous mobile app security testing we perform for our customers and reveals just how widespread mobile privacy risks have become. To help organizations address this growing challenge, we’re introducing NowSecure Privacy, a new automated solution for mobile app privacy risk management.

Methodology

NowSecure researchers analyzed data from our NowSecure Mobile Application Risk Intelligence (MARI) service, which continuously assesses more than 4 million public mobile apps for security, compliance, safety and privacy risks. Important note: absolutely no customer-provided data has been included in this analysis in any form, not even anonymized or aggregated data.


Privacy is about transparency, compliance and trust.

Key points:

  • Mobile apps are a “privacy risk blind spot” for organizations. Out of 23,300 distinct iOS app packages tested in August 2025:
    • 35% failed to disclose the data they collect,
    • 42% were missing their main privacy manifest, the file where developers disclose their privacy and data-sharing practices to users (a minimal check is sketched after this list), and
    • 97% were missing the required privacy manifests for their third-party SDKs.
  • In the Android ecosystem, NowSecure also found that:
    • 10% of Android mobile apps don’t post a data safety section in the Google Play Store.
    • 40% of 10,500 mobile apps fail to declare support for the right to be forgotten (e.g. account deletion), a common obligation stipulated by leading privacy regulations.
  • AI is dramatically accelerating data analysis and extraction without permission, consent or user knowledge:
    • Of 183,000 mobile apps we scanned in 2025, 18% (33,396 apps) use artificial intelligence, and 3,541 send data to AI endpoints, introducing privacy and security risks that include sensitive data leakage and loss of intellectual property.
    • Why this matters: Most organizations have AI governance rules that require all AI usage and data sharing to be pre-approved and monitored for security and privacy purposes.
  • Mobile apps accumulate a massive amount of regulated data: 
    • Since August 2025, 75% of iOS apps tested and 70% of Android apps contain both sensitive data and tracking domains.
    • Through dangerous permissions and entitlements, mobile apps can gain access to the camera, microphone, geolocation, communication services, sensor data and private files, collecting vast amounts of sensitive private information.
    • Why this matters: This can lead to data leakage, user tracking and compliance risk. The presence of both sensitive data and tracking domains strongly implies that these apps are collecting sensitive private data and sharing it with trackers.
  • Mobile apps pose significant privacy risks due to hidden or unintended data collection and sharing, impacting users and enterprises.
    • Of 50,000 apps NowSecure tested in August, more than 77% contained common forms of PII.
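
To ground the manifest findings, here is a minimal sketch of what a privacy manifest check can look like. Apple’s privacy manifest is a property list named PrivacyInfo.xcprivacy shipped inside the app bundle, and the keys below come from Apple’s documented manifest schema; the script itself is an illustration under those assumptions, not NowSecure’s actual tooling.

```python
import plistlib
import sys
import zipfile

def check_privacy_manifests(ipa_path: str) -> None:
    """Report every PrivacyInfo.xcprivacy found in an .ipa archive."""
    with zipfile.ZipFile(ipa_path) as ipa:
        manifests = [n for n in ipa.namelist()
                     if n.endswith("PrivacyInfo.xcprivacy")]
        if not manifests:
            print("FAIL: no privacy manifest found at all")
            return
        for name in manifests:
            with ipa.open(name) as fp:
                data = plistlib.load(fp)
            # Keys are from Apple's documented privacy manifest schema.
            collected = data.get("NSPrivacyCollectedDataTypes", [])
            domains = data.get("NSPrivacyTrackingDomains", [])
            print(f"{name}: {len(collected)} collected data types, "
                  f"{len(domains)} tracking domains, "
                  f"tracking={data.get('NSPrivacyTracking', False)}")

if __name__ == "__main__":
    check_privacy_manifests(sys.argv[1])  # e.g. MyApp.ipa
```

A manifest that is present but empty would still pass this naive check, which is one reason automated testing cross-references declared data types against the app’s observed behavior.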

What Is Driving Privacy Risk for Mobile Applications?

Dangerous Permissions

Google calls them dangerous permissions and Apple calls them entitlements; both grant access to private and sensitive information. Dangerous permissions and entitlements include access to storage, SMS, the camera, the microphone, photos, precise location and more, often beyond what users expect or understand.

Mobile apps on both Android and iOS routinely request more permissions or entitlements than necessary, significantly expanding privacy risk for users and organizations. Elevated entitlements can even exploit hidden system capabilities, compounding the threat of unauthorized data exposure and potential abuse by malicious actors or careless development practices.

Out of 378,000 Android apps tested, 62% request one or more dangerous permissions, and more than 70% of those apps also transmit sensitive data while contacting tracking domains. Together, these behaviors significantly increase the risk of sensitive data leaking to third parties through poorly managed permissions.
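
As a minimal illustration of what a permission scan looks for, the sketch below inspects a decoded AndroidManifest.xml (for example, one produced by apktool) against a small, illustrative subset of Android’s dangerous permissions; a real assessment covers the full list and correlates it with runtime behavior.

```python
import xml.etree.ElementTree as ET

# Illustrative subset of Android's "dangerous" protection-level permissions.
DANGEROUS = {
    "android.permission.CAMERA",
    "android.permission.RECORD_AUDIO",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_SMS",
    "android.permission.READ_EXTERNAL_STORAGE",
}
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def dangerous_permissions(manifest_path: str) -> set[str]:
    """Return the dangerous permissions requested in a decoded manifest."""
    root = ET.parse(manifest_path).getroot()
    requested = {el.get(ANDROID_NS + "name")
                 for el in root.iter("uses-permission")}
    return requested & DANGEROUS

print(sorted(dangerous_permissions("AndroidManifest.xml")))
```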

Dark Supply Chain

One distinct challenge arises from the widespread integration of third-party SDKs and APIs within mobile applications, which renders conventional assessment of only your own code, and the risk mitigation strategies built on it, insufficient. “If those aren’t tested, enterprises are still accountable for their risks,” warns NowSecure Cofounder Andrew Hoog. “Tested or not, when you ship that app, it has your brand name on it.”

Selective Disclosure

Transparency is a major privacy concern and, in legal disputes, a primary factor driving settlements, penalties and fines. Nearly 35% of all iOS apps tested fail to disclose data collection, and almost all omit required SDK manifests. This is a significant privacy blind spot. These findings should serve as a wake-up call for app developers to build the required transparency into their privacy disclosures, encompassing all third-party components such as SDKs, APIs and AI.

Voracious AI

AI capabilities in mobile apps create a new vector for collecting extensive device information. In a 2025 study of 183,000 mobile apps, NowSecure found that 18% already use artificial intelligence in some form, while 2% were observed sending data to remote AI endpoints. NowSecure also observed that some of these AI-enabled apps transmit this data unencrypted to cloud-based AI services, where it can be intercepted or accessed by unauthorized parties.
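
As a rough sketch of how traffic captured during dynamic testing might be screened for this behavior, the snippet below flags requests to a few well-known hosted AI services and calls out any sent over cleartext HTTP. The hostname list is illustrative, not NowSecure’s detection logic.

```python
from urllib.parse import urlparse

# Illustrative hostnames of hosted AI services; a production detection
# list would be far larger and continuously updated.
AI_HOSTS = {
    "api.openai.com",
    "api.anthropic.com",
    "generativelanguage.googleapis.com",
}

def screen_requests(urls: list[str]) -> None:
    """Flag observed requests that target known AI endpoints."""
    for url in urls:
        parsed = urlparse(url)
        if parsed.hostname in AI_HOSTS:
            transport = "CLEARTEXT" if parsed.scheme == "http" else "TLS"
            print(f"AI endpoint: {parsed.hostname} over {transport}")

# URLs as they might be captured while exercising an app in a test harness.
screen_requests([
    "https://api.openai.com/v1/chat/completions",
    "http://api.openai.com/v1/chat/completions",  # misconfigured cleartext call
])
```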

The AI models themselves pose additional risks: sensitive information used to train the models can be accidentally exposed to other users, and carefully crafted prompts can trick AI systems into revealing private data or bypassing security controls.

Taken together, these “invisible” data flows create immense compliance and brand risk. In fact, a recent study found that only 34% of businesses know how data is collected, stored and shared across their organizations, underscoring that this remains a clear data governance challenge.

With 77% of apps containing common forms of PII, including your location, where you work, your friends, your family and your accounts, the unavoidable conclusion is that many app developers don’t know what their mobile apps collect and share. Recent settlements, such as Google’s with the Texas AG ($1.375B) and Healthline’s under the California Consumer Privacy Act (CCPA) ($1.55M), illustrate the point: regulators are paying close attention and taking substantial measures to enforce privacy requirements.

Consumer-facing regulations like GDPR and CCPA also grant rights directly to individuals, including the right to be forgotten (or “erasure”). During the last week of August and the first week of September, NowSecure found that 40% of 10,500 Android apps don’t declare support for a way to be forgotten. While some of these apps may in fact support account deletion, without declaring it in Google Play users lack visibility and developers risk non-compliance with disclosure requirements.
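
The Play Store’s data safety section is listing content rather than a machine-readable API, so the record structure in the sketch below is hypothetical; the point is that this is a string-level declaration check, not proof the app actually deletes anything.

```python
def declares_deletion(data_safety: dict) -> bool:
    """True if a scraped listing declares a user-initiated deletion path.

    The dict layout here is hypothetical; real Play listings are HTML
    that must be scraped and normalized first.
    """
    practices = data_safety.get("security_practices", [])
    return any("request that data be deleted" in p.lower() for p in practices)

listing = {
    "security_practices": [
        "Data is encrypted in transit",
        "You can request that data be deleted",
    ],
}
print(declares_deletion(listing))  # True
```

Passing a check like this shows only that the declaration exists; verifying that the deletion flow actually works requires dynamic testing.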

The Real World Impact of Mobile Privacy Risks

The data makes it clear that excessive permissions, exposed PII, gaps in transparency and the accelerated adoption of AI combine to create massive attack, leak and breach risks. So it should be no surprise that this summer, three high-profile mobile app privacy incidents made headlines:

  • The SarangTrap malware campaign used more than 250 fake dating apps to steal user data, including contacts, private photos, SMS messages, device information and other sensitive data, which attackers ultimately used to blackmail and extort some victims.
  • In the same genre, breaches of the legitimate and highly popular lifestyle apps Tea Dating and TeaOnHer exposed intimate images, private messages and government IDs. In both cases, class-action lawsuits have already been filed against the companies for failing to properly secure and safeguard sensitive personally identifiable information.
  • On the litigation front, the New England Patriots VPPA settlement highlights the growing legal risks mobile app developers face regarding user privacy, especially when handling sensitive data like video viewing histories and geolocation information.

Under the settlement, the New England Patriots agreed to a substantial payout after allegations that their app shared personally identifiable information with third parties without proper consent. For developers, this underscores the urgent need to implement robust privacy practices, including transparent consent mechanisms and clear limitations on data sharing with third-party services or analytics providers. 

The case sets a precedent showing that even indirect identifiers and analytics transmissions can trigger significant liability under privacy laws. Ultimately, this settlement serves as a warning: failure to secure user data and comply with privacy regulations can result in costly litigation, financial penalties, and reputational damage for app publishers.

At the same time, new legislation continues to expand privacy obligations across jurisdictions, types of data and processing activities.

These developments underscore a critical truth: privacy in the mobile era is not just about security; it’s about transparency, compliance and trust.

Introducing NowSecure Privacy

NowSecure Privacy offers a pragmatic approach to mobile application privacy risk management, designed to help enterprises stay ahead of these challenges with continuous, automated privacy testing and governance.

  • Regulatory & Business Impact Analysis: Apps are tested based on business impact and regulatory requirements (GDPR, CCPA, COPPA and HIPAA), as well as evolving laws covering children’s privacy and new data types like neural data.
  • Standards Alignment: Apps are tested against OWASP MASVS privacy controls.
  • Automated Privacy Testing: Detects hidden data leaks, unsafe SDKs, excessive permissions and unapproved AI use.
  • Auditable Paper Trails: Track what data is collected, where it flows and who receives it.
  • End-to-End Workflow Integration: Equips developers, AppSec and privacy teams to identify and remediate privacy leaks before they become breaches.
  • SDK and AI Governance: Proactively assess and manage risk from external components (SDKs, APIs and AI) and data brokers.
  • Privacy + Security: Combines the privacy and security testing necessary to stop breaches.

Building Trust Through Privacy

Consumers expect transparency. Regulators demand compliance. By automating privacy testing and governance, organizations gain trust, reduce liability and can position compliance as a competitive advantage.
