How can you protect your data in the face of the rise of artificial intelligence?

In the age of all-digital services and new technologies, shopping habits and consumption patterns are changing. But this shift is not without risks.

Since computer attacks cannot be prevented entirely, early detection makes it possible to limit their costs. (Photo: Shutterstock)

Connected refrigerators, automated home lighting, autonomous vehicles, drone deliveries, robots capable of answering all your questions in several languages… While artificial intelligence (AI) promises to make life easier for customers and to meet their needs, it is not without risks, especially for the security of their personal data. This is why Europe wants to complement its General Data Protection Regulation (GDPR) with a set of harmonized rules for the use of AI. A few days before European Data Protection Day on January 28, the European Consumer Centre France explains the challenges and expectations of these texts in the face of the digitalization of consumption.

Increasingly connected and digitized consumption

Calculating electricity consumption to offer suitable tariffs, connected watches that detect certain pathologies from an abnormal gait or an excessively fast heart rate, chatbots serving as customer service, remote programs to turn on the heating at home… artificial intelligence has gradually worked its way into our consumption habits.
And this is just the beginning! Many companies are already working on technologies and business practices that rely on other forms of AI: drone deliveries, autonomous taxis, virtual-reality marketing, and voice bots are all in development.

What are the risks to consumers?

All these new modes of consumption carry risks for users. Since artificial intelligence involves many players (developers, suppliers, importers, distributors, users of AI), the system remains opaque to the consumer. It is therefore difficult to know who actually has access to personal data and who would be responsible in the event of a problem.
On the other hand, since AI systems are programmed and automated, the risk of technical failure must be considered, and the consequences could be harmful: an uncontrollable autonomous car, a widespread power outage, false information, or a poor diagnosis…
Finally, the risk of leakage of, or loss of control over, stored personal data is significant: cyberattacks, hacking, phishing and other targeted digital marketing techniques, fake news, fraud, etc.

European protection for the use of artificial intelligence

Faced with the growth, but also the dangers, of AI, Europe wants to strengthen its protective rules. In addition to the GDPR and the European Data Governance Act, the European Union has proposed three texts: a regulatory framework for artificial intelligence, an AI liability directive, and a product liability directive.
In particular, Europe wants to ban from the market and sanction "AI with unacceptable risk", for example, systems that would locate individuals remotely, in real time and in public spaces, with the aim of arresting or punishing them. It wants to assess and control "high-risk AI" that is directly related to the safety of a product (such as autonomous cars). And the EU wants to regulate "AI with acceptable risk" by forcing, for example, digital giants, social platforms, and networks to better inform users about their algorithms.
As with the GDPR, the fines provided for in these AI texts are significant: 10 to 30 million euros, or 2% to 4% of turnover, in the event of a breach of obligations.
"Europe's challenge now is to adopt these texts quickly, moving faster than the innovation and investment being poured into artificial intelligence," says Bianca Schulz, head of the European Consumer Centre France.
"Consumers are not always aware that asking personal questions, such as medical questions, in a chat tool gives the companies behind this artificial intelligence sensitive information that can be exploited for commercial purposes. That is why, in order to protect their data, consumers should always inform themselves about the company collecting their data and its policy for processing this personal information," concludes Bianca Schulz.
