According to The Verge (translation dated August 28, 2025), Anthropic is introducing changes to its data processing policy: Claude users are asked to decide by September 28 whether they will allow their new or resumed conversations and coding sessions to be used for training AI models.


Essence of the Changes

  • With user consent, the data retention period is extended from the previous 30 days to five years (The Verge, anthropic.com).
  • This applies to Claude Free, Pro, and Max users, including use of Claude Code. Corporate offerings (Claude for Work, Gov, Education, API) are not affected (The Verge, anthropic.com).
  • New users will make their choice during registration, while existing users will see a pop-up notification featuring a large “Accept” button and a small toggle switch that is enabled by default. This creates an “opt-out” mechanism: users who do not want to be included must actively refuse participation (The Verge, anthropic.com).

Stated Motives and Possible Company Goals

Anthropic claims that the changes are aimed at improving model safety and enhancing capabilities in coding, analysis, and reasoning (anthropic.com).

However, analysts point out the practical aspect: access to a vast volume of “real” conversations helps Anthropic strengthen its competitive position against OpenAI, Google, and other market players.


Challenges and Risks for Users

  • Privacy: because the toggle is enabled by default, many users may consent unintentionally (The Verge, anthropic.com).
  • Control over data: while users can change their preference in settings at any time, data that has already been used for training cannot be removed from the model.

International Context

A similar situation recently confronted OpenAI: the company was legally required to retain ChatGPT conversations indefinitely, even if users deleted them. Such data retention has drawn regulatory criticism — in the U.S., for example, the FTC has warned that hidden or unclear changes to privacy policies could lead to enforcement actions.


Conclusion

Claude users face a difficult choice: consent and contribute to technological progress at the expense of their privacy, or refuse and retain control over their data. In this context, it is crucial that companies maintain a balance between innovation and data protection, while users remain well-informed about such changes.


Disclaimer:
This material is intended for analytical and informational purposes only. All information used in preparing this article was obtained from publicly available sources, with references provided in the text. The editorial board is not responsible for possible changes or clarifications made after publication. The views and assessments expressed reflect the author’s position and do not necessarily coincide with the official stance of the editorial board.

By admin
