Whitepaper on the EU AI Act

AI without the cloud: How local hardware intelligence brings data sovereignty back to the company

Key takeaways at a glance
- On-device AI as a game changer: New processors (NPUs) make it possible to run powerful AI models locally. This significantly reduces dependency on US-based cloud providers.
- Privacy through technology: Local processing makes it easier to implement “privacy by design” (Art. 25 GDPR), as data transfers to third countries are minimized.
- EU AI Act applies: AI systems operated locally are also subject to regulation. SMEs must assess their device fleets for high-risk use cases.
- Shadow AI risk: AI functions deeply embedded in operating systems (e.g. Windows Copilot) require new internal policies and technical access controls.
- Management liability: As with NIS2 or the GDPR, company leadership remains responsible for AI governance and for monitoring the algorithms in use.
Introduction
Over the past two years, artificial intelligence (AI) has been inseparably linked to the cloud for most small and medium-sized enterprises (SMEs). Anyone who wanted to use ChatGPT, DeepL, or Midjourney had to send data to external servers—often located outside the EU. 2026 now marks a radical turning point: AI is moving directly into hardware.
So-called AI PCs and AI smartphones are no longer just marketing buzzwords. They are equipped with specialized processors, known as NPUs (Neural Processing Units), which are optimized to run complex neural networks directly on the local chip. For SMEs, this signals a shift from a “cloud-first” strategy to an “edge AI” approach.
What may initially look like a simple hardware upgrade is, on closer inspection, a fundamental transformation for corporate compliance. For the first time in years, companies have the opportunity to harness the power of AI without handing over data sovereignty to tech giants. However, this gain in autonomy comes with a new level of responsibility.
The technical deep dive: What's behind local AI?
To understand the legal and strategic implications, it is necessary to look at the technology behind this “deep integration.”
Until now, AI applications have worked much like a phone call: you ask a question, it is sent over the internet to a data center, processed there, and the response is sent back. With AI-integrated devices, this process happens internally.
- NPUs (Neural Processing Units): These specialized computing cores are designed to perform the mathematical operations of AI models (matrix multiplications) extremely efficiently and at high speed, without placing a load on the main CPU.
- Small Language Models (SLMs): While models like GPT-4 are massive, optimized models such as Llama 3, Mistral, or Microsoft’s Phi-3 deliver impressive performance with significantly lower memory requirements. They fit directly into the RAM of a modern laptop.
- Unified memory architecture: Modern chips (such as Apple’s M-series or Intel’s Core Ultra) allow AI to access data at lightning speed, enabling real-time features like live translation or automated video analysis.
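To make the bullet points above concrete: the workload an NPU accelerates is, at its core, plain matrix arithmetic. The following self-contained Python sketch is purely illustrative (real models involve billions of such operations, dispatched to the NPU by vendor runtimes such as DirectML or Core ML); it computes one tiny neural-network layer on the CPU:

```python
# Illustrative sketch: the core operation of a neural network layer is a
# matrix multiplication followed by a simple non-linearity. NPUs exist to
# run vast numbers of these operations without loading the main CPU.

def matmul(a, b):
    """Multiply matrix a (m x n) by matrix b (n x p)."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def relu(m):
    """Element-wise ReLU activation: negative values become zero."""
    return [[max(0.0, x) for x in row] for row in m]

# A toy "layer": one input vector (1x3) times a 3x2 weight matrix.
inputs = [[1.0, 2.0, 0.5]]
weights = [[0.2, -0.1],
           [0.4, 0.3],
           [-0.6, 0.8]]

output = relu(matmul(inputs, weights))
print(output)  # approximately [[0.7, 0.9]]
```

A production SLM chains thousands of such layers, each with matrices holding millions of values, which is exactly why a dedicated accelerator matters.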
The EU AI Act: Hardware integration under the microscope
The new EU AI Act is formulated in a technology-neutral way. This means it does not matter whether the AI runs in the cloud or locally on the sales director’s laptop.
Risk classification
SMEs must assess which risk category their local use of AI falls into:
- Minimal risk: Local spam filters or webcam image enhancements. These applications are subject to very few requirements.
- Transparency obligations: AI systems that interact with people or generate content (e.g. local chatbots for customer service) must be clearly identified as such.
- High-risk AI: This is where things become critical for SMEs. If a company uses locally installed AI software to assess job applicants, evaluate creditworthiness, or monitor employees, it is subject to strict requirements for risk management, data quality, and human oversight.
Documentation is mandatory
Even for local systems, SMEs must maintain proper records. If AI functions are used for business purposes, they must be documented in the record of processing activities (RoPA) and in the documentation required under the AI Act. “I didn’t know my laptop did this automatically” will not be accepted by regulators.
Management liability: A wake-up call
Similar to the NIS2 Directive, responsibility for the secure use of technology is shifting squarely onto company management.
Duties of care
Managing directors must ensure that:
- AI literacy exists within the organization. Leadership needs to understand the risks in order to make informed investment decisions.
- Monitoring mechanisms are in place. Companies must not blindly rely on the outputs of local AI systems, which can contain errors such as hallucinations.
- Resources for security are adequately provided.
In cases of gross negligence—for example, if it is known that employees are processing highly sensitive customer data with unvetted local AI tools and damage occurs—personal liability may arise. Implementing AI is not merely an IT project, but a strategic compliance responsibility.
Governance framework: How SMEs can safely introduce AI hardware
An uncontrolled rollout of AI PCs will inevitably lead to legal issues. A structured approach is therefore essential.
1. The “AI Acceptable Use Policy” (AUP)
Create a binding set of rules for all employees. It should clearly define:
- Which AI functions of the operating system are permitted and which must be disabled
- Which categories of data may be processed with local AI
- How AI-generated results must be labeled and verified
2. Technical mobile device management (MDM)
Use mobile device management to centrally control AI functions. This allows you, for example, to block features that automatically mirror data to the cloud (such as cloud synchronization of AI analyses) across the entire organization.
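As a concrete illustration of such a control: on Windows, the Copilot assistant can be disabled per user via a Group Policy that maps to a registry value, which MDM tools can then roll out fleet-wide. Policy names change between Windows releases, so verify this against Microsoft's current Group Policy documentation before deploying:

```reg
Windows Registry Editor Version 5.00

; Maps to the Group Policy "Turn off Windows Copilot"
; (User Configuration > Administrative Templates > Windows Components > Windows Copilot)
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\WindowsCopilot]
"TurnOffWindowsCopilot"=dword:00000001
```

Equivalent switches exist for other embedded AI features; the governance point is that each one is an explicit, documented decision rather than a default.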
3. Adapting employment contracts and works agreements
Since AI integration often changes how work may be monitored or evaluated, early involvement of the works council (if applicable) and updates to internal policies on performance monitoring are necessary.
4. Vendor risk management (VRM)
Even if the AI runs locally, the software comes from vendors (Microsoft, Apple, Dell, Lenovo). SMEs must assess which telemetry data these hardware and software providers collect despite local processing.
Conclusion: Don't be afraid of hardware AI, but respect its complexity
The growing adoption of devices with deep AI integration represents a historic opportunity for SMEs. It allows them to harness the efficiency gains of AI while avoiding the “data graveyard” of the cloud.
However, autonomy requires discipline. SMEs that jump on the AI PC bandwagon without adapting their compliance processes risk significant fines under the EU AI Act and the GDPR. Those that combine hardware innovation with professional compliance, on the other hand, create a real competitive advantage: digital resilience.
FAQ: Frequently asked questions about AI-integrated hardware
Do I need to replace all old laptops with AI PCs immediately?
No. However, for new purchases you should look for devices with an integrated NPU, as future business software (such as Microsoft Office 2026+) will run many features smoothly only with local hardware acceleration.
Is local AI automatically GDPR-compliant?
Not necessarily. The GDPR regulates not only where data is stored, but also the lawfulness of processing, purpose limitation, and data subject rights. Local AI also requires a legal basis (Art. 6 GDPR).
How can I tell if a device has deep AI integration?
Look for processor names such as “Intel Core Ultra,” “AMD Ryzen AI,” or “Snapdragon X Elite.” On Apple devices, all systems with M chips (M1 to M4) include an integrated Neural Engine.
Does the EU AI Act also apply to freelancers?
Yes. The regulation applies to all actors who place AI systems on the market or put them into operation in the EU, regardless of company size. However, the obligations scale with the risk level of the application.
Important: The content of this article is for informational purposes only and does not constitute legal advice. The information provided here is no substitute for personalized legal advice from a data protection officer or an attorney. We do not guarantee that the information provided is up to date, complete, or accurate. Any actions taken on the basis of the information contained in this article are at your own risk. We recommend that you always consult a data protection officer or an attorney with any legal questions or problems.



