ChatGPT returns to Italy after ban

ChatGPT maker OpenAI has restored access to its service in Italy, saying it has implemented changes to satisfy Italian regulators. “ChatGPT is available again to our users in Italy,” it said in a statement published by The Associated Press and also sent to The Verge. “We are excited to welcome them back, and we remain dedicated to protecting their privacy.”

OpenAI said it had “addressed or clarified” the issues raised by the Italian Data Protection Authority (GPDP) in late March. The regulator accused OpenAI of unlawfully collecting users’ personal data through ChatGPT and of failing to prevent underage users from accessing inappropriate material, prompting OpenAI to block ChatGPT in the country. The company was given 20 days to address the issues, and regulators said in mid-April that ChatGPT could return if it did so by April 30th.

Among the changes, OpenAI pointed The Verge to a new form that EU users can submit to request the removal of their personal data under Europe’s General Data Protection Regulation (GDPR). It also says a new tool will verify users’ ages at signup in Italy, and it has published a help center article outlining how OpenAI and ChatGPT collect personal information, including how to contact its GDPR-mandated data protection officer.

The GPDP didn’t immediately respond to a request for comment from The Verge. In a statement published by the AP, it said it “welcomes the measures OpenAI implemented” and urged the company to make further age verification changes and to run a publicity campaign informing Italians of their right to opt out of data collection.

So far, none of these changes appear to dramatically alter how ChatGPT operates in Italy. But OpenAI will almost certainly face further challenges. Spain, Canada, and other countries have opened or are considering investigations into its practices, including how it collects training data for its large language models and what information those models produce for users. And European lawmakers are advancing the AI Act, which could impose additional requirements on companies like OpenAI, potentially including significant new disclosure obligations.
