
Alexa Now Sends All Recordings to Amazon - Is It GDPR Compliant?

Arthur
22.07.2025

Main Points

  • Amazon removed the “Do Not Send Voice Recordings” option – all Alexa+ interactions are now sent to the cloud by default.
  • This raises serious GDPR concerns, especially around consent, data minimization, and the right to object.
  • Key risks include a lack of age verification and no meaningful opt-out for users.
  • Businesses building voice tech should prioritize privacy by design, granular consent options, and conduct DPIAs to ensure compliance.

In a controversial move, Amazon has updated its privacy policy for Alexa-enabled devices, with serious implications for GDPR compliance: as of March 28, 2025, Amazon disabled the “Do Not Send Voice Recordings” setting on Echo devices that support on-device storage. In other words, all Alexa interactions triggered by the wake word are now sent to Amazon by default, with no way to opt out of cloud processing.

The shift coincides with the recent launch of Alexa+, Amazon’s new generative AI assistant designed to enable more advanced, personalized, and context-aware conversations.

Amazon claims this is necessary to power Alexa+ features like memory, improved comprehension, and contextual dialogue. But the change also means users lose control over whether their voice data is stored, analyzed, or potentially used to train AI models. This raises serious concerns about privacy, consent, and legality, especially under the GDPR.

For businesses building AI tools, voice assistants, or smart home products, this shift is a wake-up call about the importance of privacy-by-design, and a reminder that user trust and legal compliance are inseparable.


GDPR Concerns in Alexa’s New Recording Policy

From a GDPR standpoint, Amazon’s new Alexa policy raises several concerns.

While Alexa still requires the wake word to start recording, the removal of the opt-out option for sending recordings to the cloud changes the legal picture. Now, every triggered interaction is stored and processed off-device.

Let’s look at the main GDPR principles Amazon’s new approach may violate:

1. Consent

Under Articles 6 and 7 of the GDPR, data processing must be based on freely given, specific, informed, and unambiguous consent.

Amazon’s decision to remove the ability to opt out of cloud processing reduces user control. Users are no longer asked to agree to the collection and storage of their recordings; the feature is simply always on. The “Do Not Send Voice Recordings” setting has been replaced by a “Don’t save recordings” setting, which means recordings are still sent to and processed in the cloud and are only deleted after Alexa has handled the request. This undermines the requirement that consent be freely given and specific.

2. Right to Object

Additionally, under Article 21 of the GDPR, individuals have the right to object to the processing of their personal data when it’s based on legitimate interest rather than consent.

However, at the moment, there’s no mechanism to exercise this right. The only alternative is to stop using Alexa entirely or mute the device, which doesn’t meet GDPR standards for a meaningful opt-out. Users are effectively forced to accept the cloud processing of their voice data if they want to continue using the product.

3. Data Minimization

Under Article 5 of the GDPR, the principle of data minimization requires that only data necessary for a specified purpose be collected.

With Alexa+, Amazon is now sending all triggered voice data to the cloud by default. The company says this is required to support Alexa’s new generative AI capabilities, like memory and personalized responses. But GDPR asks a different question: is this level of data collection proportionate?

Sending all interactions to the cloud may be more than necessary for the stated purpose. Without clear technical justification or granular control, this practice may breach the minimization requirement.

4. Children’s Data

Children’s personal data requires special protection under the GDPR. In many EU countries, children under 16 cannot legally consent to data processing without parental approval.

Alexa devices are often used in households with children. If a child speaks after the wake word, Alexa will now record and send that data to Amazon by default, even if no adult has actively agreed to it. Amazon has not explained how it verifies whether a speaker is a minor, nor how it ensures valid parental consent.

This opens the door to the unintentional collection of children’s data without a proper legal basis, which could result in regulatory action. A similar issue has already surfaced in the USA - in 2023, the Federal Trade Commission fined Amazon $25 million for violating the Children’s Online Privacy Protection Act (COPPA) by retaining kids’ Alexa voice recordings indefinitely.

[Flowchart: Alexa+ voice processing steps]

Best Practices for GDPR-Compliant Voice and AI Products

Amazon isn’t the first company to face scrutiny over how it handles voice data. Google, Apple, and Meta have all drawn criticism for practices like accidental recordings, inadequate consent, or sending audio to contractors for review. These incidents have made it clear that voice technology and privacy must be designed hand in hand.

Amazon’s latest move is one of the clearest examples yet of how AI development can clash with data protection principles. It reflects a broader trend: as companies race to build smarter, more personalized AI assistants, they are increasingly willing to reduce user control in favor of model performance.

For businesses building voice-enabled tools, this shift is a warning: GDPR compliance isn’t just a box to tick. Overstepping it invites regulatory action, reputational damage, and user backlash.

So what does responsible voice tech design look like in practice?

If your product captures, stores, or processes voice data, whether through smart speakers, AI transcription tools, or generative AI assistants, here’s how to reduce risk and stay compliant:

  1. Gain Explicit, Granular Consent - Users must understand exactly what they’re agreeing to. Offer opt-in choices for voice recording, and let users control how their data is used, whether for service delivery, product improvement, or AI training.
  2. Minimize What You Collect - Only capture the data you truly need. For example, avoid storing full audio when text transcripts will suffice.
  3. Make Privacy the Default - Don't rely on users to hunt through settings to disable tracking. The default configuration should protect user privacy from the start, in line with GDPR’s “privacy by design and by default” principle.
  4. Be Clear About What Happens to Voice Data - In onboarding flows and privacy notices, state in simple terms what data is collected, why it’s collected, how long it’s stored, and who has access to it.
  5. Respect Data Subject Rights - Give users easy tools to access, delete, or export their data, especially for sensitive data like voice. Make these rights usable without friction, delay, or hidden steps.
  6. Consider Context and Third Parties - Voice assistants often operate in shared spaces such as offices, homes or public environments. Build features that prevent unintended data collection from guests or children. Use voiceprint recognition or contextual cues to limit access.
  7. Conduct DPIAs for Voice and AI Projects - A Data Protection Impact Assessment (DPIA) helps identify and mitigate privacy risks early. If your voice tech involves sensitive data, profiling, or automated decisions, a DPIA is not optional; it’s required.

[Graphic: GDPR compliance traffic-light overview of Alexa risks]

Conclusion

Amazon’s decision to have Alexa record everything by default may be framed as a technical improvement, but it signals a deeper tension between innovation and privacy.

Businesses that embrace the GDPR for responsible tech development will be better positioned to succeed long-term. Those who ignore it risk penalties and reputational damage.

At heyData, we make compliance easy for everyone, whether you’re building the next big AI product or simply getting your compliance foundations in place.

If you enjoyed this article, read more about top data breaches and privacy scandals of 2025.

FAQs about Alexa and GDPR

Can I stop Alexa from sending my recordings to Amazon?

No. As of March 28, 2025, Amazon removed the “Do Not Send Voice Recordings” setting. You can choose not to save your recordings after processing, but you can’t prevent them from being sent to Amazon’s cloud in the first place.

Is Alexa GDPR compliant after these changes?

That is currently under debate. Several GDPR principles, including consent, data minimization, and the right to object, may be undermined by the removal of the opt-out setting. Users now have limited control over how their voice data is processed.

What should businesses building voice tech do to stay GDPR compliant?

Businesses should prioritize privacy by design: offer granular consent options, limit data collection, provide clear privacy notices, and conduct Data Protection Impact Assessments (DPIAs). Compliance is not optional when handling sensitive voice data in the EU.

Why did Amazon remove the “Do Not Send Voice Recordings” setting?

Amazon says the change supports Alexa+, its new generative AI assistant. Features like memory and personalized conversations require cloud processing. As a result, recordings are now always sent to Amazon’s servers, even if users don’t want them stored.

Is cloud processing of voice data allowed under GDPR?

Cloud processing isn’t prohibited under the GDPR, but it must meet requirements such as lawfulness, fairness, transparency, and purpose limitation. If users can’t refuse the processing and no clear legal basis, such as consent, is established, it may violate the GDPR.

Important: The content of this article is for informational purposes only and does not constitute legal advice. The information provided here is no substitute for personalized legal advice from a data protection officer or an attorney. We do not guarantee that the information provided is up to date, complete, or accurate. Any actions taken on the basis of the information contained in this article are at your own risk. We recommend that you always consult a data protection officer or an attorney with any legal questions or problems.