Data Protection Made Simple
Our whitepaper shows you how to implement data protection requirements in practice, minimise risks, and ensure ongoing compliance – including checklists and best practices.

Digital sovereignty at risk? Why Facebook's plans to scan unpublished images are highly controversial in terms of data protection law

The most important facts at a glance
- Planned intervention: Meta wants to scan private, unpublished images locally on smartphones.
- Legal risks: Possible violations of GDPR and revDSG regarding transparency, purpose limitation, and consent.
- Digital sovereignty: Risk of losing control over personal data even without publishing it.
- Relevance to businesses: BYOD and app permissions increase the risk of data protection violations.
- Recommended measures: MDM systems, data encryption, separation of private/work areas, training.
- Strategic opportunity: Early action strengthens data protection compliance and business trust.
Background – Why Meta's plans are controversial
Meta wants to scan private, unpublished images directly on users' smartphones in the future – officially to detect abusive content at an early stage. Technically, the approach is similar to client-side scanning, but legally it poses considerable risks. Accessing personal content without active consent conflicts with key GDPR and revDSG principles such as transparency, purpose limitation, and voluntary consent. Particularly critical: In companies with bring-your-own-device policies, the boundaries between private and business data become blurred, which encourages data protection violations and reputational damage. Those who implement clear guidelines, technical protective measures, and awareness now will maintain digital sovereignty and legal security.
Meta (formerly Facebook) plans to access private, unpublished images directly on users' smartphones.
Officially, the project aims to combat child abuse by identifying relevant content at an early stage. However, in terms of data protection law, this constitutes a massive intrusion into informational self-determination. This development poses new challenges not only for private individuals, but also for companies, IT decision-makers, and data protection officers.
What Meta is planning
According to current information, Meta is developing technology that scans photos locally on mobile devices before they are published or shared with third parties. The technical approach is reminiscent of so-called client-side scanning technology. This involves analyzing content directly on the device, rather than after it has been uploaded to a server. Meta is reportedly pursuing this goal to detect abusive content at an early stage. Technically, this may be innovative, but from a legal and ethical perspective, the approach is highly problematic.
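To make the mechanics concrete, the following is a minimal sketch of how on-device matching against a list of known content can work. It is an illustration only: real systems such as PhotoDNA or Apple's NeuralHash use perceptual hashes that also match slightly altered images, whereas this sketch uses an exact cryptographic hash, and the blocklist digest here is simply the SHA-256 of the bytes `b"test"`.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known images.
# Real client-side scanning systems use perceptual hashing; an exact
# hash is used here only to keep the sketch self-contained.
KNOWN_DIGESTS = {
    # SHA-256 of the bytes b"test", used as a stand-in entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_locally(image_bytes: bytes) -> bool:
    """Return True if the image matches a blocklisted digest.

    Everything happens on the device: in principle only the match
    result, not the image itself, would need to leave the phone.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_DIGESTS

print(scan_locally(b"test"))        # → True (matches the stand-in entry)
print(scan_locally(b"holiday.jpg")) # → False
```

Note that the data protection concern raised in this article is independent of this technical detail: even though the analysis runs locally, the scan still covers private content the user never chose to share.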
Legal basis: GDPR and revDSG
GDPR – Protection of personal data
The EU General Data Protection Regulation (GDPR) and Switzerland's revised Data Protection Act (revDSG) provide comprehensive protection for personal data. Meta's planned approach could violate several fundamental principles.
According to Article 5(1)(a) of the GDPR, personal data must be processed lawfully, fairly, and in a transparent manner in relation to the data subjects. The very idea that an app scans private images without asking and without active user interaction contradicts this transparency requirement.
Another critical point is the purpose limitation requirement under Article 5(1)(b) of the GDPR: data may only be collected for specified, explicit, and legitimate purposes. When users save a photo on their device, they do not do so for the purpose of making it available for review by a provider. Nor is informed consent within the meaning of Article 7 GDPR usually given in such cases.
In addition, Article 25 GDPR requires companies to implement data protection through technology design (privacy by design). This includes, among other things, collecting only data that is necessary for the respective purpose. The systematic searching of all image data on an end device is contrary to this principle.
revDSG – Swiss data protection law
Similar principles apply in Switzerland. The revised Data Protection Act (revDSG), which came into force in September 2023, also emphasizes transparency, purpose limitation, and proportionality. Here, too, access to private content without clear consent and a legal basis is not permitted.
Digital sovereignty at the heart of the debate
What makes this project so controversial is the shift in control over personal data. When providers such as Meta determine which content may be analyzed without users actively initiating or consenting to this, individual control over one's own data is undermined. The right to informational self-determination is a cornerstone of European data protection law. It states that every person has the right to decide for themselves when and within what limits personal data is disclosed.
The idea that third parties can access private content that has never been shared fundamentally calls this principle into question. Even if the purpose—such as combating child abuse—seems legitimate, this must not lead to an erosion of constitutional principles. Otherwise, technological developments threaten to undermine fundamental rights instead of protecting them.
Why companies are affected
At first glance, it may seem as though this development only affects private individuals. However, access to end devices and private data also has far-reaching consequences for companies. In many organizations, Bring Your Own Device (BYOD) is a reality. Employees use private smartphones and tablets for work purposes. If applications are installed on these devices that independently and unsupervised access private data, new risks arise.
Under the GDPR, companies share responsibility for data processing on such devices, especially if company data is stored or processed. The risks range from data protection violations and data leaks to damage to reputation. Compliance officers should be aware that the boundaries between private and business data on mobile devices are becoming increasingly blurred.
Focus on app permissions
One often underestimated risk is the access rights that apps receive when they are installed. Many applications request full access to the camera, microphone, or the entire storage area of a device. In practice, this means that an app could theoretically access all stored photos—even those that have never been shared with third parties. Even if access is technically local, it still constitutes an invasion of privacy.
It is therefore advisable for companies to formulate guidelines for app use on devices used for business purposes. A whitelist of approved applications and regular audits of installed software can help minimize risks.
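A whitelist audit of this kind can be sketched in a few lines. The app identifiers and the source of the installed-apps inventory are assumptions; in practice an MDM system would supply this data automatically.

```python
# Hypothetical whitelist of approved app package names.
APPROVED_APPS = {"com.example.mail", "com.example.calendar", "com.example.vpn"}

def audit(installed: set[str]) -> set[str]:
    """Return installed apps that are not on the approved list."""
    return installed - APPROVED_APPS

# Example inventory from a managed device (names are illustrative).
violations = audit({"com.example.mail", "com.social.photoapp"})
print(sorted(violations))  # → ['com.social.photoapp']
```

Apps flagged this way can then be reviewed for the access rights discussed above, such as full storage or camera permissions.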
Technical and organizational measures
Article 32 of the GDPR requires companies to implement appropriate technical and organizational measures to ensure a level of protection for personal data that is appropriate to the risk. In the context of mobile use, this means specifically:
- Use of mobile device management (MDM) systems to control app rights and access permissions
- Encryption of sensitive data on the device
- Separation of private and business data areas
- Training for employees on the secure use of mobile devices
These measures should not be seen merely as a mandatory exercise, but as a strategic investment in data security and the trust of employees and customers.
Consent must be voluntary
Another key element of the GDPR is consent. This must be voluntary, informed, specific, and unambiguous. If users effectively have no choice but to consent to data processing due to opaque terms and conditions or complex app settings, there is a lack of genuine voluntariness. Companies that use third-party software must therefore check whether the consent mechanisms of these tools meet the legal requirements.
Monitor regulatory developments
It is likely that Meta's initiative will provoke political and regulatory reactions. In the past, similar plans – such as those proposed by Apple – have been withdrawn following massive public criticism. However, technological developments are advancing, and with them the possibilities for automated analysis directly on the end device.
Companies should keep a close eye on these developments and assess early on how new technologies will affect their data protection strategy. Acting now helps avoid risks and positions the company as a trustworthy player with a firm grip on data protection.
Conclusion: Act now, before others do
Meta's plans mark a new level of technological control over personal data. For companies, this is a wake-up call to rethink their data protection strategies. Those who establish clear rules, implement technical protective measures, and raise awareness among employees today will be better prepared tomorrow—legally, ethically, and strategically. Data protection is not only an obligation, but also a competitive advantage. Companies that take digital sovereignty seriously will secure the trust of their customers and partners in the future.
FAQ: Questions and answers for practical use
Can apps access unpublished images?
Technically, yes, if the app receives the appropriate permissions. Legally, however, this is only permissible if informed and voluntary consent has been given. Without this, access to private content is generally not permitted.
How can companies protect themselves?
Through clear guidelines on app use, the use of MDM systems, and regular training for employees. It is also important to raise awareness of data protection-compliant behavior when using mobile devices.
What role does consent play in such technologies?
A central one. Without valid consent, there is no legal basis for data processing. Companies should carefully check how third-party apps obtain consent and whether this complies with legal requirements.
How do the GDPR and revDSG differ in this case?
Both frameworks follow similar principles such as transparency, proportionality, and purpose limitation. The GDPR is more specific in many respects, while the revDSG leaves somewhat more room for interpretation. In practice, however, both lead to the same result: covert or involuntary data analysis is not permitted.
Important: The content of this article is for informational purposes only and does not constitute legal advice. The information provided here is no substitute for personalized legal advice from a data protection officer or an attorney. We do not guarantee that the information provided is up to date, complete, or accurate. Any actions taken on the basis of the information contained in this article are at your own risk. We recommend that you always consult a data protection officer or an attorney with any legal questions or problems.


