- Dr Arnt Glienke, LL.M.
- 20.05.25
- Reading time: 5 minutes
Microsoft Recall and the GDPR: Innovation with data protection risks
With an update to Windows 11, Microsoft introduced the ‘Recall’ function – a feature that at first glance looks like a practical productivity tool: Recall regularly saves screen recordings, analyses them using artificial intelligence and thus enables a time-based full-text search of past activities.
What is technically impressive, however, raises considerable questions under data protection law – particularly with regard to the requirements of the General Data Protection Regulation (GDPR). The use of Recall harbours potentially serious risks, especially for companies.
What is Microsoft Recall?
Recall is part of the new Copilot+ PCs and allows you to ‘travel back in time’ through your computer activities. The system automatically creates screenshots every few seconds, saves them locally and processes them with the help of AI. This allows users to access content that was displayed on the screen at a certain point in time – whether it’s documents opened, websites visited or applications used.
Microsoft advertises Recall as a personal memory aid – but the benefits are in conflict with the requirements for the protection of personal data.
Classification under data protection law: These GDPR principles are affected
The GDPR sets out clear principles for the processing of personal data. In the case of Recall, several of these principles come to the fore:
- Purpose limitation (Art. 5 para. 1 lit. b GDPR): Data may only be processed for specified, explicit purposes. Recall stores information without a clear purpose in the individual case – a problem for its legal classification.
- Data minimisation (Art. 5 para. 1 lit. c GDPR): Only necessary data should be collected. Permanent, comprehensive recording of the screen appears to run counter to this.
- Storage limitation (Art. 5 para. 1 lit. e GDPR): Data must be deleted as soon as it is no longer needed for its purpose. By default, Recall stores the information for up to three months.
- Transparency (Art. 12, 13 GDPR): Users must know which data is processed and how. Despite active consent, many details remain unclear.
- Data subject rights (Art. 15 et seq. GDPR): Although Microsoft provides deletion and management tools, their scope and effectiveness have not yet been conclusively clarified.
Risks for users and companies
As impressive as the technical implementation of the ‘Recall’ function may be, its use brings with it considerable data protection and practical challenges. Depending on the context, these not only affect individual users, but also companies in particular, which are confronted with questions of compliance, data security and employee data protection.
While private users are primarily confronted with the risk of unintentional data disclosure, companies find themselves in a particularly delicate situation: as soon as Recall is used in a work environment, basic legal obligations are at stake – from the protection of personal data to labour law requirements. There is not only the threat of data protection violations, but also damage to image and conflicts of trust.
For private users:
- Sensitive data: Passwords, health information, bank details or private communications can be stored unintentionally – without the user being aware of it.
- Potential for misuse: If the device is lost or compromised, extensive screen data could fall into unauthorised hands.
- Feeling of surveillance: The permanent, albeit local, recording can give users a subjective feeling of being observed and of their digital freedom being restricted.
For companies:
- Breach of confidentiality obligations: Business and trade secrets, customer information or internal meeting content could be documented without the company’s knowledge.
- Unauthorised employee monitoring: If Recall is used in a corporate environment without clear guidelines, transparency and co-determination, there is a risk of conflicts under labour law.
- Compliance risks: The use of Recall can cross the threshold of data protection relevance; in such cases, a data protection impact assessment (DPIA) is mandatory in order to systematically record risks and protective measures.
Microsoft's reaction: Data protection features as a protective shield?
In the face of criticism from data protectionists and the public debate about potential surveillance, Microsoft has responded to data protection concerns – at least from a technical perspective. The company emphasises that Recall is designed for transparency and control and should leave users in control of their data.
In detail, Microsoft has integrated the following protection mechanisms:
- Recall is deactivated by default (opt-in).
- The stored data remains locally on the device.
- Encryption and connection to Windows Hello are designed to prevent unauthorised access.
- Private content such as passwords or incognito browser windows is excluded from recording.
- Users have the option of pausing Recall or deleting data.
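Two of these mechanisms – the storage limit and the exclusion of private content – can be illustrated in isolation. The sketch below is purely hypothetical: the function names, marker list and data layout are invented for illustration and do not reflect how Recall actually implements these safeguards.

```python
from datetime import datetime, timedelta

# Hypothetical illustration of two safeguards: a retention limit
# (snapshots older than a cut-off are purged) and an exclusion filter
# (flagged content is never stored). Not Microsoft's implementation.

RETENTION = timedelta(days=90)          # 'up to three months' by default
EXCLUDED_MARKERS = ("password", "incognito")

def should_store(extracted_text: str) -> bool:
    """Exclusion filter: refuse to store snapshots with flagged content."""
    text = extracted_text.lower()
    return not any(marker in text for marker in EXCLUDED_MARKERS)

def purge(snapshots: list[tuple[datetime, str]],
          now: datetime) -> list[tuple[datetime, str]]:
    """Retention limit: keep only snapshots younger than the cut-off."""
    return [(taken_at, text) for taken_at, text in snapshots
            if now - taken_at < RETENTION]
```

The limits of such filters are also visible in the sketch: a keyword-based exclusion only catches what it recognises, which is one reason why the legal questions below remain open despite the technical measures.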
These default settings are a step in the right direction – but they do not fully resolve the challenges posed by data protection law. In the corporate environment in particular, key questions remain unanswered regarding transparency, control options and legal admissibility. Companies should therefore not rely solely on Microsoft’s technical measures, but should develop their own protection concepts and clear usage guidelines.
Assessment of the data protection supervisory authority
European data protection authorities are also taking a critical view of Recall. The Bavarian State Office for Data Protection Supervision, for example, is calling for clear standards in the default settings, transparent user information and robust technical security. A viable legal framework, particularly for corporate use, is still lacking.
Technological progress needs clear rules
Microsoft Recall is an example of how quickly innovations can challenge regulatory frameworks. The function offers potential – especially for private use, provided that users give their informed consent and technical safeguards are in place.
For companies, on the other hand, it is better to be safe than sorry: Recall should not be used without clear guidelines, technical access restrictions and data protection assessments. A data protection impact assessment (DPIA) is essential in order to systematically record the risks and define suitable protective measures. Companies should use Recall, if at all, only after a careful legal and technical review – ideally with the involvement of the data protection officer and with clear terms of use defined for the employees affected.
Microsoft, on the other hand, has a responsibility to think about data protection from the outset. Only if ‘privacy by design’ is more than just a buzzword can Recall really achieve the balancing act between innovation and the protection of fundamental rights.
Your personal contact
Matthias Schulz, Director Sales
- +49 40 257 660 967
- +49 40 257 660 919
- m.schulz@clarius.legal