EU consumer group calls for ‘urgent investigations’ of generative AI risks


[Image: EU flag superimposed on a circuit board]

Consumer groups in Europe are urging authorities to protect consumers against the risks of generative AI, like ChatGPT.

On Tuesday, the European Consumer Organization (BEUC), representing consumer groups from 13 European countries, published a call to action, citing generative AI's ability to spread disinformation, entrench bias and discrimination, and create scams.

The BEUC's statement comes on the heels of the European Parliament's approval of its draft of the AI Act, which seeks to define and classify various forms of AI by their risks so they can be regulated accordingly.

If passed, the EU AI Act would be the world's first comprehensive body of law directly targeting AI. But the BEUC believes European consumers need protection under existing laws in the meantime.

"We call on safety, data and consumer protection authorities to start investigations now and not wait idly for all kinds of consumer harm to have happened before they take action," said Ursula Pachl, Deputy Director General of the BEUC. "These laws apply to all products and services, be they AI-powered or not, and authorities must enforce them."

The announcement coincides with the publication of a report from BEUC member Forbrukerrådet (the Norwegian Consumer Council), titled "Ghost in the Machine: Addressing the consumer harms of generative AI." The report outlines the harms generative AI could inflict on consumers, including the concentration of Big Tech power, the creation and proliferation of deepfakes, bias in training data, privacy risks, job displacement through automation, and environmental impact.

The EU has been a global leader in enforcing digital protections for consumers while cultivating a business-friendly environment. In 2022, it adopted the Digital Markets Act, aimed at tackling Big Tech gatekeeping and giving users more choice over the technologies they use. Addressing the EU AI Act, Pachl said, "It's crucial that the EU makes this law as watertight as possible to protect consumers."