What you might not have seen is, well, yourself caught in the crosshairs of the glasses’ camera.

Now, a new report—and a federal lawsuit that quickly followed—alleges Meta is even less transparent than those thick lenses, claiming the company is quietly routing users’ footage to human workers overseas rather than relying solely on its AI models.

These workers have seen everything from people undressing to sensitive financial documents, all because users opted into data sharing for AI training purposes.

“In some videos you can see someone going to the toilet, or getting undressed. I don’t think they know, because if they knew they wouldn’t be recording,” said a worker who has reviewed footage from the glasses.

In late February, Swedish publications Svenska Dagbladet and Göteborgs-Posten published an investigation into Meta’s AI training pipeline, finding that Meta contractors in Kenya help train the artificial intelligence powering the glasses (the Ray-Ban Meta Wayfarer (Gen 2), the Ray-Ban Display, and the Oakley Meta HSTN models).

What they saw was startling.

“We see everything, from living rooms to naked bodies,” a worker said in the report.

“Meta has that type of content in its databases.”

Any user who opts into sharing data for AI training purposes effectively allows every part of their life to be recorded and then reviewed, either by the AI the footage is meant to train or by the humans behind it.

That includes footage of people in bathrooms, undressing, and watching porn, and in at least one documented case, a pair of glasses left on a bedside table captured a partner who had never consented to being recorded.

Meta’s subcontractors—data annotators teaching the AI to interpret images by manually labeling content—also reported viewing users’ credit card numbers and financial documents.

At the time of the report’s release, Meta responded through a spokesperson, saying: “When people share content with Meta AI, like other companies we sometimes use contractors to review this data to improve people’s experience with the glasses, as stated in our privacy policy.

This data is first filtered to protect people’s privacy.”

A class action begins

The report triggered legal action.

On March 4, plaintiffs Gina Bartone and Mateo Canu filed a class action lawsuit against Meta Platforms (and glasses-maker Luxottica of America), accusing the companies of violating federal and state laws by failing to disclose that videos captured by the glasses are transmitted to servers and then to a Kenyan subcontractor for manual labeling. Referencing new privacy bills and regulations that have emerged in response to the rise of AI and the surveillance economy, the suit says that “Meta knows this,” in reference....