Why it matters: To deliver the highly personalized results Apple promises, the company needs users' trust, and the policies designed to earn it.
Catch up quick: Tech companies, Apple included, don't have to say where they get the data used to train their models. But they do have to say how customer data is used.
Zoom in: Apple has been steadily expanding Apple Intelligence, with its most recent update bringing ChatGPT integration and AI image generation. Apple says it doesn't use customers' private data or their interactions to train its foundation models.
Yes, but: People who choose to sign in to a ChatGPT Plus account can access additional features, though at that point OpenAI's privacy policies apply.
The big picture: Many AI systems avoid personalizing results with user data, or at least don't do so by default.
Apple Intelligence, by contrast, is personalized, including when it generates images.
Between the lines: Apple hasn't just talked the talk on privacy: Apple Intelligence is designed to keep as much of the work as possible on the device.