Snap launched its ‘My AI’ chatbot for Snapchat in February this year as an experimental feature, stressing that it had been developed with safety in mind.
However, the UK’s data protection watchdog, the Information Commissioner’s Office (ICO), has now criticised the company on privacy grounds. Following an investigation, it has issued Snap with a ‘preliminary enforcement notice’ over what it describes as a “potential failure to properly assess the privacy risks” posed by My AI.
“The provisional findings of our investigation suggest a worrying failure by Snap to adequately identify and assess the privacy risks to children and other users before launching ‘My AI’,” said the information commissioner, John Edwards.
“We have been clear that organisations must consider the risks associated with AI, alongside the benefits. Today’s preliminary enforcement notice shows we will take action in order to protect UK consumers’ privacy rights.”
The preliminary notice means Snap now has a chance to respond to the ICO’s provisional findings, after which the regulator will decide whether to issue a ‘final enforcement notice’. That could force Snap to stop offering the feature in the UK until it has carried out “an adequate risk assessment”.
A Snap spokesperson told the Guardian that the company is reviewing the ICO’s preliminary decision. “Like the ICO, we are committed to protecting the privacy of our users,” they said.
“In line with our standard approach to product development, My AI went through a robust legal and privacy review process before being made publicly available. We will continue to work constructively with the ICO to ensure they’re comfortable with our risk assessment procedures.”